Introduction
Space scientists continuously interact with numerical data. Interaction implies a response, which in space science relies greatly on perceiving and interpreting patterns or isolated events in the data. This interaction occurs during processing: visual perception and interpretation of the data drive the next stage of the analysis and the extraction of meaning. It raises challenges in effective data mining, data interaction, data perception, and data display. At present, interaction is limited by the numerical data analysis packages available and by the resolution of current displays. With traditional analysis techniques based on visual representation alone, trends and patterns may be overlooked. As data sampling becomes faster and data sets grow larger, the need to assign meaning to numerical data acquired from the natural laboratory of the interstellar medium grows with them. This new challenge makes clear the need to develop new data analysis techniques and tools that target user needs and the challenges of data mining and perusal.
Nevertheless, international efforts appear to have been limited to the development of packages addressing visualization (visual display) of big data, without recognition of the possibilities and benefits that accessibility and a multi-sensory (more than one sense) approach can offer, according to several perception studies (Díaz-Merced, 2013).
The telegraph, and later the telephone, led to great advances in telecommunications, most notably the real-time transfer of a voice signal from point ‘a’ to point ‘b’. Those advances, and the research on static and interference undertaken to improve communication, led to great scientific discoveries through audio perception. The use of multi-sensory perception for data analysis in the space sciences has been documented as far back as Preece (1894) and Barkhausen (1919), who used the auditory modality to study “atmospheric impulses” which, as described in their papers, produce “a faintly musical or chirping sound called tweeks”. Considering in depth the telecommunication origins of those discoveries, the times of Preece (1894) and Barkhausen (1919) were a period when technological advances were starting to bloom. At that point in history, sensory terminology was even accepted to describe space science phenomena: whistler-mode emissions, ringing, chirping, tweeks, and so on. With the development of new digital technologies, however, people with disabilities cannot participate in the field, leaving a large group of non-sighted people behind.
Additionally, the digital era, with its continuous competition in the development of digital interfaces, requires neither licensing nor continuing education in User-Centred Design (UCD), Human Computing Experience (HCE), or Human-Computer Interaction (HCI) for developers and programmers to practise professionally. Thus human functioning, functional diversity, and the ways people act are most often not taken into consideration when designing software. This may be a factor behind developer-interpreted accessibility and technologies normalized to a single sensory modality and a single style of perception, interaction, and interpretation, defining for users what they will do and how, and leaving a large population out of the loop.
On the other hand, objective analysis requires knowing exactly what we are looking for, whereas the human brain is not limited in this respect. Instead of excluding functionally diverse people by failing to provide adequate frameworks, it is time to enhance current data analysis displays with tools that allow all people to succeed in their work. According to Hassan et al. (2013), the human brain is still by far the most powerful tool for the perusal of large data sets; in the end, it is a human who interprets, recognizes, and decides whether there is anything of importance in the data.
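The multi-sensory approach argued for above can be illustrated with a minimal sonification sketch: mapping each value of a numerical series onto an audible pitch, so that trends can be perceived by ear rather than by eye. This is only an illustrative example using Python's standard library; the function names, frequency range, and sample data are assumptions for the sketch, not part of any package cited in this article.

```python
import math
import struct
import wave

def value_to_frequency(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly map a data value in [vmin, vmax] onto a pitch in [fmin, fmax] Hz."""
    if vmax == vmin:  # a constant series maps to the lowest pitch
        return fmin
    return fmin + (value - vmin) / (vmax - vmin) * (fmax - fmin)

def sonify(data, path="sonified.wav", rate=44100, note_seconds=0.15):
    """Render each sample of `data` as a short sine tone in a mono 16-bit WAV file."""
    vmin, vmax = min(data), max(data)
    frames = bytearray()
    for value in data:
        freq = value_to_frequency(value, vmin, vmax)
        for n in range(int(rate * note_seconds)):
            # 16-bit signed sample at half amplitude, little-endian
            sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * n / rate))
            frames += struct.pack("<h", sample)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# Hypothetical flux-like values: rising values are heard as rising pitch.
sonify([0.1, 0.5, 0.9, 0.3])
```

In this sketch a listener hears the shape of the series directly, which is the kind of non-visual perusal the perception studies cited above motivate; a real tool would add amplitude, timbre, and spatial mappings for multi-dimensional data.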