Ideatrotter: Disruptive 2.0 Intelligence

" Until recently, memory problems indicated a deficiency in personal character, a shortage of “ethics or humanity.” This outlook was a sign of the times: Informational scarcity fueled an ethos of individualism. Today, advances in technology and technique enable vast quantities of networked information to be stored and retrieved cheaply, simply, and reliably. Information abundance fuels its own ethos where interdependency and mediation take center stage. Go to a party and brag about your ability to recall contact information. Nobody will toast your commitment to swimming against the tide of memory depletion. Instead, folks will tell you and your antiquated sensibilities to get a life and a smartphone. "

#quote #thought #memory #brain #future

It is a well-known fact of cognitive science that human short-term memory (SM), when compared to other attributes of our memory systems, is exceedingly limited. This fact has been the focus of thousands of studies over the last 50 years. Scientists have poked and prodded this aspect of human cognition to determine exactly how SM operates and what impacts its effectiveness. As we go about our daily lives, short-term memory makes it possible for us to engage with all manner of technology and with the environment in general.

SM is a temporary memory that allows us to remember a very limited number of discrete items, behaviors, or patterns for a short period of time. SM makes it possible for you to operate without constant referral to long-term memory, a much more complex and time-consuming process. This is critical because SM is fast and easily configured, which allows one to adapt instantly to situations that might otherwise be fatal if one were required to access long-term memory. In computer-speak, human short-term memory is also highly volatile. This means it can be erased instantly, or more importantly, it can be overwritten by other information coming into the human perceptual system.
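The volatility described above can be pictured with a toy analogy (not a cognitive model): a small, fixed-capacity buffer in which new input silently overwrites the oldest items. The capacity of 4 here is arbitrary, chosen only for illustration.

```python
from collections import deque

# Toy analogy: short-term memory as a tiny fixed-capacity buffer.
# New items silently overwrite the oldest ones, the way incoming
# stimuli displace whatever was being held in SM.
short_term = deque(maxlen=4)  # capacity chosen arbitrarily

for item in ["name", "phone", "address", "email", "meeting time"]:
    short_term.append(item)

# "name" has been pushed out by newer input
print(list(short_term))  # ['phone', 'address', 'email', 'meeting time']
```

Note that nothing signals the loss: the first item is simply gone, much as SM contents vanish without any awareness of erasure.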

Where things get interesting is the point where poor user interface design increases the demand placed on SM. For example, a user interface design that requires the user to view information on one screen, store it in short-term memory, and then reenter that same information in a data field on another screen seems like a trivial task. Research shows that it is difficult to do accurately, especially if some other stimulus intervenes between memorizing the data on the first screen and entering it on the second.

This disruptive data flow can take almost any form, but as a general rule, anything that is engaging, such as conversation, noise, motion, or worst of all, a combination of all three, is likely to totally erase SM. When you encounter this type of data flow before you complete the transfer of data using short-term memory, chances are very good that when you go back to retrieve the important information from short-term memory, it is gone!

One would logically assume that any aspect of user interface design that taxes short-term memory is a really bad idea. As was the case with response time, a more refined view leads to surprising insights into how one can use the degradation of short-term memory to actually improve game play engagement. Angry Birds is a surprisingly smart manager of the player’s short-term memory.

By simple manipulation of the user interface, the Angry Birds designers created significant short-term memory loss, which in turn increased gameplay complexity, but in a way that is not perceived by the player as negative and that adds to the addictive nature of the game itself. The subtle yet powerful concept employed in Angry Birds is to bend short-term memory but not to actually break it. If you do break SM, make sure you give the user a very simple, fast way to accurately reload.

(Source: underpaidgenius, via stoweboyd)

#gaming #cognitive computing #memory #brain #science #technology #tech

IBM - SyNAPSE: a cognitive computing project from IBM Research

Beyond machines

For more than half a century, computers have been little better than calculators with storage structures and programmable memory, a model that scientists have continually aimed to improve.

Comparatively, the human brain—the world’s most sophisticated computer—can perform complex tasks rapidly and accurately using the same amount of energy as a 20-watt light bulb in a space equivalent to a 2-liter soda bottle.

Cognitive computing: thought for the future

Making sense of real-time input flowing in at a dizzying rate is a Herculean task for today’s computers, but would be natural for a brain-inspired system. Using advanced algorithms and silicon circuitry, cognitive computers learn through experiences, find correlations, create hypotheses, and remember—and learn from—the outcomes.

For example, a cognitive computing system monitoring the world’s water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making.
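As a vastly simplified, hypothetical illustration of the scenario above (everything here, including the baseline and threshold values, is invented): readings from networked sensors are aggregated and a warning is raised when wave height deviates sharply from a baseline. A real cognitive system would learn such thresholds from experience rather than hard-code them.

```python
def tsunami_warning(wave_heights_m, baseline_m=1.0, threshold_m=3.0):
    """Return True if any reading exceeds the baseline by the threshold.

    wave_heights_m: wave-height readings in metres from several sensors.
    """
    return any(h - baseline_m > threshold_m for h in wave_heights_m)

readings = [0.8, 1.1, 0.9, 5.6, 6.2]  # metres, from several buoys
print(tsunami_warning(readings))  # True
```

The contrast with the quoted vision is the point: this rule is fixed in advance, whereas a brain-inspired system would form and revise its own hypotheses about what counts as anomalous.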

(via smarterplanet)

#cognitive computing #computers #memory