Data vs sensory perception

Main question

How does our use of engineered interfaces affect our relationship with traditional sensory input?

Enhanced senses are a good topic, but not what I see as the main focus.

Mostly I'd like to talk about how our notion of input is currently changing.

Both in the very physical sense: the way we sense the physical world, and how that changes as we get used to interfaces.

Abstract

About how our notion of what it means to get information via our five basic senses is changing, even without enhanced senses.

1. Physical: how we get information about our environment.

What does it mean to get information about the environment?

What happens: information is transmitted from the environment, through any intermediary interfaces (sounds picked up by a hearing aid), to the body.

The environment supplies only part of the information (ex/ a database of smell/taste linked to memory, visual object recognition). The definition of 'sense' (for this discussion) includes the interpretation that you do as part of processing sensory input.

The question is not whether you're using a hearing aid or an eardrum (made of meat or of technology), or whether your environment is natural or simulated. It's asking: what is the difference between the case where your senses and your information-processing systems are the product of evolution and learning (apples on a tree), and the case where they are the product of a designed system of information distribution, a designed method of information capture? We live in a designed world; how does that change us?

We can engineer the environment, interfaces, and the body to change what's happening with your senses:

  • modify the environment (ex/ an immersive gaming experience or a well-designed store)
  • 'modify' interfaces; they are all going to be products of design
  • modify the body
  • even inside the body, how you use sensory information is subject to design (hack automatic body responses with biofeedback; calibrate the interpretation of the senses themselves, e.g. the experience of cold water)


Filters: we are only using some of our ability to get information about the world around us.

We're not always aware of all our senses.

We are constantly filtering out input from our sensory systems, sorting relevant from irrelevant (selective attention: skipping ad banners in a visual scan of a webpage; prioritizing: focusing on traffic signals vs. on the way socks feel in shoes).

How have designed interfaces changed how you filter your senses? (ex/ an audiovisual bias in processing that filters out body language)


We can take in information without paying attention to it (ex/ dancing as a follow, immersive gaming: using senses without consciously processing them).

So, a question separate from what information you're using: what are you paying attention to?

Are you paying more or less attention? (More = the information is more abstract and you're forced to recall more context; less = we're not required to decide for ourselves what we should be processing. Reading city traffic lights versus reading tracks in a forest, for example.)


Tech is becoming more immersive, but there's also a lot more of it. Is tech leading to more immersion or to more conscious attention?

Abstract

What it means to get information from our environment, and how that's changing...

The notion of what a 'sense' is has changed...


What does it mean in our daily lives to be used to looking at things like oven lights and computer panels, where designers have used their awareness of how we process information to design the interface?


Differences between interfaces and environments

Background: procedural versus declarative processing. Procedural = tool use; declarative = knowledge 'about' things.

  • interfaces are designed to be easier to process, but are sometimes harder because:
    • they contain more out-of-context information. Abstraction is no problem; the brain abstracts all the time. But context matters for how much work the brain has to do (ex/ walking and suddenly seeing an owl, out of context... vs. the more aggressively decontextualized: surfing the web and hitting a pop-up ad with no reference point).
    • they involve more symbolic information processing, such as language and maps, and less directly referential information, such as hearing strange noises and seeing animal tracks (brain as simple database vs. brain as interpreter).
  • at least in our culture, homogeneity is built into the created object/experience: it's the same every time (ex/ starting up a computer, going to the bank). Is it different, with respect to the senses going into autopilot, to cut and paste repetitively versus stripping bark from non-designed trees for a canoe?

Is there a difference between canned info from a designed environment (information that a machine picks up for you) and your 'natural' senses? Sometimes yes, and sometimes the difference lies in how we think of it: whether we think of it as an environment or as an interface (ex/ consumerism, anthropomorphization, social constructs). Examples:

Other differences

  • psychologically, when we think of it as receiving information from somebody versus as us figuring out what's going on around us: personification of the information source ('the RFID tag says it's a rose' vs. the impersonal 'it smells like a rose'). Do we trust the information differently?
  • willingness to be interactive and gather sensory input: we assume we're being guided through interactions, so all relevant input will be provided (ex/ self-checkout machines at the store, traffic signals versus watching the weather change)
  • expectations of user-friendliness and manageability of input:
    • what it means that there's too much (ex/ what if the forest were a product of design? how would we rate it?)
    • 'firehose' issues with information overload versus being in a forest, unable to process all the information
  • personification (of a computer, of a boat you're steering): not sensing an object but interacting with a person. Does this change the senses? The consumer attitude of being 'served' versus simple anthropomorphization, such as personifying lakes and rivers...
  • how we know which thing is important to pay attention to... are we getting more savvy as we create our own environments (a game environment encouraging interactivity), or more hand-fed (when the car is broken, looking to instrument panel readings rather than sniffing for funny car smells)?

Ex/ what we are learning to focus on: designed feedback in how to catch a fish (the pole as interface) versus no easy cues to teach you how to find good firewood (the forest completely undesigned). Contrast both with the even more designed: a game interface with a life bar and "I've been hit" noises.

Hypothetical... musings on the future of tech:

  • smart interfaces: interaction taking place in the physical world rather than through mouseclicks
    • ex/ tipping an iPhone, movable cubes from MIT. How will this style of interacting with technology change our senses? (See the tilt-reading sketch after this list.)
    • since the experience of both is, at some point, immersive (no longer paying attention to the 'how' of the interaction), does it matter for our senses how much the interaction uses physical, naturalistic modes of interaction?
  • interacting with tech vs. interacting with the world through tech: interactions where the interface is not a substitute for the world, or a simulation of the world, but a lens through which you see the world (ex/ literally, retinal overlays; or, practically speaking, using a Google Maps application to geolocate; see the geolocation sketch after this list)
  • you can still double-check the data against what you see. But when do you check, and when do you trust it and take it in automatically, as if it were a sense? (ex/ helicopter pilots relying completely on instrument panels to know where 'up' is versus relying completely on vision that lets them see through the floor: what is the practical difference?)
  • does it matter for our senses how much the interaction is literally plugged into our senses?
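A minimal sketch of that first style of interaction, assuming a web context: the standard browser DeviceOrientationEvent already delivers 'tipping the device' as plain numeric data. The tilt threshold and the panMap handler below are hypothetical placeholders, not anything from the source; the point is that the designer decides which physical motions count as input at all.

```typescript
// Sketch: treating "tipping the phone" as an input event.
// Uses the standard browser DeviceOrientationEvent; TILT_THRESHOLD_DEG and
// panMap are hypothetical placeholders for illustration only.

const TILT_THRESHOLD_DEG = 15; // hypothetical dead zone before a tilt "counts"

// Stand-in for whatever the interface does with the gesture.
function panMap(dx: number, dy: number): void {
  console.log(`pan by (${dx.toFixed(1)}, ${dy.toFixed(1)})`);
}

// Ignore tilts inside the dead zone; pass through only the excess beyond it.
function deadZone(degrees: number): number {
  return Math.abs(degrees) > TILT_THRESHOLD_DEG
    ? degrees - Math.sign(degrees) * TILT_THRESHOLD_DEG
    : 0;
}

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  const frontBack = deadZone(e.beta ?? 0);  // tilt toward/away, in degrees
  const leftRight = deadZone(e.gamma ?? 0); // tilt left/right, in degrees

  // The designer, not the user, chose which physical motions are "input":
  // small tilts are filtered out, larger ones become navigation.
  if (frontBack !== 0 || leftRight !== 0) {
    panMap(leftRight, frontBack);
  }
});
```

The dead-zone filter is a designed analogue of the selective attention discussed above: the interface itself decides which sensory input is relevant before you ever do.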
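And a companion sketch for the 'lens' and trust questions in the last bullets, assuming the standard browser Geolocation API. The accuracy cutoff below which the reading is taken in automatically is a hypothetical design choice, included only to make the 'when do you double-check?' question concrete.

```typescript
// Sketch: a machine-supplied sense of "where you are," reported as data.
// Uses the standard browser Geolocation API; TRUST_ACCURACY_METERS is a
// hypothetical cutoff, not part of any real spec.

const TRUST_ACCURACY_METERS = 25; // hypothetical: trust fixes this precise or better

navigator.geolocation.getCurrentPosition(
  (pos: GeolocationPosition) => {
    const { latitude, longitude, accuracy } = pos.coords;

    if (accuracy <= TRUST_ACCURACY_METERS) {
      // Taken in as if it were a sense: used automatically, no cross-check.
      console.log(`You are at ${latitude}, ${longitude} (accuracy ${accuracy} m)`);
    } else {
      // The instrument admits uncertainty, so fall back to your own senses.
      console.log(`Roughly ${latitude}, ${longitude}; look around to confirm.`);
    }
  },
  (err: GeolocationPositionError) => {
    // When the designed sense fails, the undesigned ones are all that's left.
    console.error(`No fix (${err.message}); navigate by eye.`);
  }
);
```

Note that the helicopter-pilot question reduces to where this cutoff sits: at one extreme you always cross-check the instrument against vision, at the other you treat its output as vision.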
