In computer science, the main idea behind blurring a photo is to set each pixel's RGB value to the average of the values of its neighbouring pixels; that averaging is what I mean by "blur" here.
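For concreteness, here is a minimal sketch of that neighbour-averaging (a fixed 3×3 "box blur" over the RGBA bytes of a canvas `ImageData`; the function name and parameters are just illustrative):

```typescript
// Average each pixel's R, G, B with its 3x3 neighbourhood (box blur).
// `data` is assumed to be ImageData.data from a canvas 2D context.
function boxBlur(data: Uint8ClampedArray, width: number, height: number): Uint8ClampedArray {
  const out = new Uint8ClampedArray(data.length);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      for (let c = 0; c < 3; c++) { // R, G, B channels; alpha copied below
        let sum = 0;
        let count = 0;
        for (let dy = -1; dy <= 1; dy++) {
          for (let dx = -1; dx <= 1; dx++) {
            const nx = x + dx;
            const ny = y + dy;
            if (nx >= 0 && nx < width && ny >= 0 && ny < height) {
              sum += data[(ny * width + nx) * 4 + c];
              count++;
            }
          }
        }
        out[(y * width + x) * 4 + c] = sum / count; // clamped array rounds for us
      }
      out[(y * width + x) * 4 + 3] = data[(y * width + x) * 4 + 3]; // keep alpha
    }
  }
  return out;
}
```

(You would get `data` from `ctx.getImageData(...).data` and write the result back with `ctx.putImageData(...)`.)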
I'm wondering if there is a mathematical relation between eye power and blurriness in pixels. I ask because I am trying to create a web app where users can experience, in real time, the vision of a myopic eye.
I'm sorry if this is the wrong Stack Exchange site; just direct me to the right one! Also, feel free to add the right tags, because this site is new to me and I have a tough time deciding on tags.
Thanks!
EDIT 1
CSS filter function: this is what I'm going to be using, and it says something about a Gaussian that I don't really understand. It would be great if you could explain it rather than just throw an equation into the answer section. Thanks!
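As far as I can tell, a Gaussian blur is like the averaging above except that neighbours are weighted by a bell curve, and the length passed to `blur()` is the standard deviation of that curve in CSS pixels. Here is a minimal sketch of how I plan to apply it from script (the `#scene` element id is made up):

```typescript
// Apply the CSS blur() filter to an image; the argument is the
// Gaussian's standard deviation (sigma) in CSS pixels.
const img = document.querySelector<HTMLImageElement>("#scene");
if (img) {
  const sigmaPx = 3; // bigger sigma = blurrier; this should depend on eye power
  img.style.filter = `blur(${sigmaPx}px)`;
}
```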
One idea would be to take visual acuity as your input: the numbers like $20/20$ that come from an eye test. You could look up the size of the characters on the $20/20$ line (a $20/20$ letter subtends about five minutes of arc at the rated distance). I would estimate that it takes about eight vertical pixels to resolve the letters, which gives you the angular size of the blur circle in a standard eye. Somebody who tests at $20/40$ needs the letters twice as large, so their blur circle is twice as large.

I don't know whether there is a standard conversion between lens power and acuity. It clearly depends on how bright the light is: in brighter light the pupil narrows, so the lens operates at a smaller aperture and the blur circle shrinks.
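To sketch that scaling numerically in your CSS setting: everything below is an assumption to tune rather than a standard conversion, and treating the blur-circle size as the Gaussian standard deviation is a crude simplification.

```typescript
// Map Snellen acuity 20/X to a blur sigma in CSS pixels.
// Assumptions: a 20/20 eye resolves about 1 arcminute, and a 20/X eye's
// blur circle is X/20 times larger (the scaling argument above).
function blurSigmaPx(
  acuityDenominator: number, // the X in 20/X, e.g. 40 for 20/40
  viewingDistanceCm = 50,    // assumed eye-to-screen distance
  pixelsPerCm = 38           // assumed display density (~96 dpi)
): number {
  const blurArcmin = acuityDenominator / 20;
  const blurRadians = (blurArcmin / 60) * (Math.PI / 180);
  const blurCm = blurRadians * viewingDistanceCm; // small-angle approximation
  return blurCm * pixelsPerCm;
}

// Example: a 20/40 viewer at 50 cm on a ~96 dpi screen gets ~1 px of blur.
// document.body.style.filter = `blur(${blurSigmaPx(40)}px)`;
```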