#Change11 #CCK12 #LAK12 Know thyself

This idea of thought-controlled computing – a tool to unlock knowledge about yourself – is intriguing.

What are the uses? Helping people with ADD to understand their thoughts and their attention deficits. Unlocking one’s mind, like plugging into your own Google. Helping us to lead a better, more balanced life by studying our sleep cycles and patterns, and our brain waves.

Would these be part of micro learning analytics based on thought-controlled computing? I wonder if the next iPad or computer will come with this personalised computing power, where the technology instructs you on how to keep focused, so that you can align your thoughts with your aims and goals.

Are we entering an age of cyborgs with such tools as tablets, iPads, iPhones and other mobile technology?

In this article on cyborgs:

There have been opposing approaches to the search for truth. One seeks absolute knowledge (the Eleatics, Plato, Confucius). The other seeks diversity and change (Heraclitus, Gorgias, Protagoras, Lao Tzu). What would this technology lead us to? A quest for understanding diversity and change rather than absolute truth?

“Systems theory suggests that change and choice are dependent on having a certain amount of instability, of abandoning rigid ways of thinking and being, It thus, at least metaphorically, supports a Heraclitian and postmodern social theoretical view of the inherent importance of change, and thus, the ability to think flexibly and make choices. The discourse of change is an essential part of emancipation, of establishing an open society. But the essential source of change comes from within (self-organization in systems talk), to which these conditions of flexibility best flourish with a great deal of personal courage in the face of our existential-cyborgian anxiety, and often despite conditions of inequality and oppression in a society.”

Would this technology help us to understand what is in our minds?

Photo: from Google


#Change11 Short Notes on Designing Cyborgs by Jon Dron

An excellent presentation by Jon Dron here. Ailsa has summarised her notes with reflections here. Jenny has posted here, and here too.

Here are some notes that I have taken, together with my reflections.

Jon explained the three types of collectives:

1. Direct

2. Mediated

3. Stigmergic – sign-based, sematectonic

Properties of effective collectives:

1. Adaptability

2. Stigmergy

3. Evolvability

4. Parcellation

5. Trust

6. Sociability

7. Constraint

8. Context

9. Connectivity

10. Scale

Jon used Brian Arthur’s definition of technology: the orchestration of phenomena for some use.

I found this similar to the concept of technology affordance, whilst Ailsa referred to ANT (Actor-Network Theory) as a way to describe the relationship between technology and humans (the actors in the networks).

I found it interesting when Jon referred to prayers as “part of technology” aimed at achieving goals – in the case of religion, to ask for forgiveness or favours 🙂

Jon summed up that all technologies are assemblies.

Soft technologies – active orchestration of phenomena by people.

Hard technologies – orchestration of phenomena embedded in the technology itself. This could relate to a set of processes or procedures which impose constraints, so that steps and instructions must be followed.

Hard is easy and efficient. Hard is complete but brittle, and could limit change and creativity. If the process is automated, people have little control over it.

Soft is hard (to do) and incomplete. It could be part human, part machine. Soft is flexible and enables creativity. People can have control over how things are used.

Design patterns:


– Adapt

– Aggregate

– Recommend

– Extend


– Automate

– Replace

– Filter

– Limit

Artificial apes – our technologies are not just reflections of us or things that we use. They are, in part or whole, made of us. This sounds like technology in us, and us in technology: technology shapes us just as much as we have shaped technology.

Good cyborg/bad cyborg

Humans are part of technologies and humans are in control – good cyborg.

Humans are part of technologies and technologies are in control – bad cyborg.

Operating manuals and legal systems could be examples where technologies are in control – bad cyborgs.

Some danger signs that a technology is too soft: repetition of boring tasks, the need for skill, complexity and puzzlement.

The holy grail: not too hard, not too soft – just right.

Assembly – remix, reuse and resample.

The use of the hashtag in Twitter is an example of a soft technology being hardened – a user convention that has since been built into the platform itself.

What we need is to design technology – part human, part machine – that is just right.

Pictures: Google images