Prop plane visual aberration (photo by Hyperubik, CC BY 3.0 US)

Never rely solely on your senses; they are imperfect and prone to being channeled through cognitive filters that have been configured using your experiences as input data.

Timothy Lee Russell | 1/11/2018 7:18:26 AM

Visual Aberrations

We rely on our senses every day. We see, hear, taste, smell, and touch the world around us, but we are vulnerable to those same senses telling us lies for a host of reasons. Check in with the cognitive and neuroscience researchers to find out more about why this is true; I am not a subject matter expert in that area. My experience is as an entity consuming and reacting to sensory input, often in questionable ways.

We are complex sensors

There are many different senses ingrained in our makeup, and over the past couple of hundred years humanity has been able to replicate some of them with technology. The sensors in our devices allow them to react to the physical universe as we do.

Power is a valuable resource, so we want to conserve as much of it as possible. With that in mind, how can we make a phone sentient enough to realize that it should turn off the screen when we put it in our pocket or set it down on a desk?

One technique is to test the status of a proximity sensor. For this use case, let's call it a sight sensor, the "Hey, who turned off the lights" or "What is that blurry, out-of-focus blob-obscuring-my-view" camera. If all of a sudden it is dark, the phone might be in a pocket! Or, it might be zero-dark-thirty in the middle of a desert with no lights. At the very moment when you need to check the map, your phone has determined that it is in your pocket and goes to sleep. This method is probably not sufficient on its own.
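The darkness-only rule above can be sketched in a few lines. This is a minimal illustration, not any real Android API; the function name and lux threshold are made-up assumptions.

```python
# Hypothetical sketch of the "who turned off the lights?" heuristic.
# The threshold is an assumption for illustration; real ambient-light
# readings vary widely between devices.

DARK_LUX = 5.0  # below this, assume the sensor is covered... or it's night

def might_be_in_pocket(lux: float) -> bool:
    """Naive rule: sudden darkness means the phone is in a pocket."""
    return lux < DARK_LUX
```

The failure mode from the text falls straight out of the code: a pocket and a moonless desert look identical to this rule, which is exactly why the method is not sufficient on its own.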

Another technique is to use an accelerometer (our sense of balance) to determine the phone's orientation. If you put it in your pocket upside down, the accelerometer can read the rotational motion as you turn the phone over, and perhaps also take into account the deceleration and the jolt as the phone hits the bottom of your pocket.
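A rough sketch of the orientation half of that check, assuming the common phone convention where an upright device reports gravity of about +9.8 m/s² on its y axis and an inverted one reports about -9.8 m/s². The function name, axis choice, and tolerance are illustrative assumptions.

```python
import math

def is_upside_down(ax: float, ay: float, az: float,
                   tolerance_deg: float = 30.0) -> bool:
    """True when the device's long axis points roughly straight down.

    Assumes a convention where a phone held upright reports roughly
    (0, +9.8, 0) and a fully inverted phone reports (0, -9.8, 0).
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g < 1e-6:  # free fall or bad data; orientation is unknowable
        return False
    # Angle between the measured gravity vector and "fully inverted".
    angle = math.degrees(math.acos(max(-1.0, min(1.0, -ay / g))))
    return angle < tolerance_deg
```

A real implementation would also low-pass filter the readings, since the jolt of hitting the bottom of a pocket briefly swamps the gravity signal.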

There are surely many other techniques, but regardless of the implementation, the high density of sensors in modern devices means we can aggregate the data retrieved from multiple sensors. These multiple facets of data allow increasingly accurate, situation-specific decisions to be made.

Make conscious decisions and experience what is there to experience

The next time you experience something, take the time to consciously notice more information with additional senses. For example, type on a mechanical keyboard. If you are a touch-typist, you probably interact with the keyboard using two senses: touch (somatosensation) and hearing (auditory perception). You feel the keys as you depress them to the appropriate depth; your position on the keyboard is informed tactilely, at least on a QWERTY keyboard with index-finger registration; and the sounds of the keys as you type different words create tiny, unique songs.

Type the word "cat" three times. Now type the word "Cat" with an uppercase "C" three times. Listen closely to how they are different. When you add the Shift key, the tune changes. Now type the word "Mississippi" three times. Listen closely. Extra points for making an audio recording of yourself typing several words with the same number of characters and then seeing if you can differentiate the words using only your ears.

Add a sense. Type the word "dirt" but this time look carefully at the keys as you type. Let your eyes shift as each finger types a letter, move as if you are in extreme slow motion, and watch your perspective of the physical geometry of the key change as it travels from a raised to a depressed state. Also listen at this point. How does typing very slowly while actively engaging the visual sense change the tune? Does it feel different to touch the keys at that speed? Extra credit for adding another sense. Smell would be fine, but we absolutely do not recommend licking your keyboard.


And now it gets weird with some role play

Imagine yourself, right now, being held in the hand of a giant. Now imagine that he turns you upside down and stuffs you in his back pocket. Remember the sensation of a corkscrew roller-coaster and then close your eyes, plunging into darkness with your arms pinned to your sides. Claustrophobic, pitch-black, hanging upside-down and wondering how you managed to get yourself in such a precarious situation.

Now give it a minute and really imagine how that would feel, because that's what your phone goes through, most likely dozens to hundreds of times per day. The way a phone tirelessly works for us is fairly magnanimous considering what we put it through.

We're sure you're normal...it's cool, either way.

Whether or not you are in the habit of anthropomorphizing your phone, pushing the power button is a drag. Gravity Screen - On/Off is a fantastic app for Android that makes your devices smarter by reducing the friction of turning the device on and off. A Company that Makes Everything™ uses software to improve business processes; in this case, to save one second, many times per day. As is the case with most businesses, it doesn't make sense to build some things in-house. It falls under the category of "apps I would have attempted to create myself" if a solid winner didn't already exist. It is the first application I install when provisioning a new Android device.

Blame the messenger

This story is all a preface to a situation two weeks ago where Gravity Screen was not working correctly. I assumed some sort of software bug. I contacted the developer and we never really figured it out. Fast forward to hearing about another app that gives you amazing insight into the sensors on your phone by displaying their output in real time.

I installed Sensors Multitool and determined that my phone's proximity sensor was non-functional. I replaced the phone and Gravity Screen began working as expected again. Since Gravity Screen was depending on faulty underlying data provided by the phone, its decision-making process was flawed.

Perhaps sensor-sanity checks could be added to Gravity Screen in the future but in the interim, you can install Sensors Multitool and make sure your phone is firing on all cylinders.
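A sensor-sanity check along those lines could be as simple as watching for a stream that never changes while the sensor is deliberately exercised, say by waving a hand over the proximity sensor, the way Sensors Multitool lets you do by eye. This is a hypothetical sketch; the function name and variation threshold are assumptions.

```python
def sensor_looks_healthy(readings: list[float],
                         min_variation: float = 0.01) -> bool:
    """Flag a sensor stream that looks dead or stuck.

    Meant to be evaluated while the sensor is deliberately stimulated
    (e.g. a hand waved over a proximity sensor); an unchanging stream
    under stimulation suggests faulty underlying data.
    """
    if len(readings) < 2:
        return False  # no data is as suspicious as flat data
    return max(readings) - min(readings) >= min_variation
```

An app depending on the sensor could refuse to make decisions, or warn the user, when this check fails, instead of silently acting on bad data.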

The takeaway

If you want to make good decisions, you need accurate data. That means you need a process for validating that your data collection methods are returning results that make sense.

Let the context and careful inspection determine the validity of the incoming data and remember, a second opinion never hurts.
