everyone can think, everyone can draw.
ok, up until this point I thought I'd been living in a regular world and would never get to see something like this come out.
they're bridging the gap of interface and reaching into our brains, using non-invasive electroencephalography (EEG) to read neural activity.
in short - there's a headset that you can use to interact with your computer.
how long do you think it will be before you can paint using thoughts directly from your brain? they already have up, down, left and right down pat - how long before pressure sensitivity and colour mapping?
the future is here, I think. I can't wait to try one of these babies out in Photoshop.
bbc story link
Last edited by alti; February 21st, 2008 at 07:50 PM.
dude, think of this as the stepping stone. remember computer art using a keyboard, then a mouse, then tablets: wacom, tiny wacom, up to the 21-inch wacom, then the cintiq, now the massive 21-inch cintiq. next will be something like this, I think. if you can direct the mouse with your mind, anything is possible: the speed will get better, the sensitivity will increase...
crazy time to be alive
I think it will take a very long time before you will be able to actually make art with this. The thing is that the human mind is very chaotic. In a game you can force yourself to be occupied with a single thought. For example "forward forward forward forward forward" or "pickup pickup pickup pickup pickup pickup pickup". In a game world this can work. But when you are drawing or painting, your thought process is not only occupied with strokes. You are also busy with narrative, color theory, perspective, etc. So instead of a single thought you get a lot of random thoughts, and most of those thoughts the helmet won't understand.
There is actually already a game on the market using this technology. I played it 3 years ago, and it was very fun. But the slightest change in my close environment forced me to analyse what had happened, resulting in a game over.
So I will remain a bit skeptical about it for now
"Master storytellers never explain. They do the hard, painfully creative thing-- they dramatize"
Anyway, I'm very excited by this, and I think in the end you won't be interacting with the software by doing shortcuts and sending messages like "open color picker", "do this and that shortcut", "hold alt", whatever... because that is a workflow based on a mouse and keyboard. I guess it will start like this because that's the workflow we are familiar with.
But slowly it will move to something more intuitive, like handling brush settings by mere feelings (and in the end I think the notion of a brush will be obsolete, because that's based on the way the hand works, making linear movements), or, say, you think about some sounds and it changes the color. It actually opens a whole world of possibilities about the way we use our brain... I think this kind of interface, once it becomes convenient, will be a major revolution... and not only for video games and art.
Seriously...nothing good will come of this.
Technological details aside, the functions that this device provides are rather limited.
In short, moving a cursor accurately enough to draw with any efficiency on a display, with your thoughts alone, is not something this gadget can muster.
I think this topic has been brought up before. Even if this device allowed the user to draw accurately with their thoughts, it still would not make any average person Leonardo da Vinci.
Removing the hand from the hand-eye-brain coordination equation would not make drawing much easier.
Artistic creation is not a purely mechanical action, and the artist's thoughts must be well composed.
"Master storytellers never explain. They do the hard, painfully creative thing-- they dramatize"
i think it's a step into an awesome new world. seeing further into the mind (the most complex computer on earth) is a wonderful thing.
even if this is the most basic of foundations - up/down, hot/cold, left/right, happy/sad - i'd still love to see what this leads to. As the devices become more accurate, i think the language of design will change much the way m@ was saying.
i think we're hundreds of years from releasing the mind's eye into a pure representation, but these are all the right steps to get to that level.
i'd love to map some shortcuts and see how it goes with this model of device.
When you can paint with these things, people will probably drop the "is digital art considered cheating?" argument and start picking on the newest innovation!
ok, I did a little more snooping around.
Here's how it works: the software has several choices for actions you can take. So, taking the disappearing cube as an example, once you're hooked up to the headset, you're directed to run a short, six-second test, where you concentrate on doing something, anything, with your mind - relax, focus, whatever.
Then, once you've completed the test, it's you against the cube. And the challenge is to see if you can reproduce what you were doing with your mind during the test; if so, the cube slowly disappears.
In my case, it disappeared, then came back, then disappeared again and then came back. Repeat.
They also ran me through another example, this time trying to pull the cube forward. This one was harder because the brain function I chose to synchronize with the challenge required more concentration. It involved me sort of tensing up my head and imagining the act of pulling the cube forward. It didn't work very well.
But with the disappearing act, I simply relaxed my mind, with much better results.
Of course, there's no relationship at all between brain activity that is consciously trying to "pull" the cube forward and what happens. That is to say, it doesn't matter in any way what you're doing with your mind, so long as what you do during the six-second calibration matches what you do when you try to enact the action.
So really, the software is just looking for a pattern match. It's not all that complicated a concept, though I'm sure it's a pretty difficult engineering feat.
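That calibrate-then-match loop could be sketched roughly like this (a toy illustration only - the headset's actual signal processing is far more involved, and the feature extraction here is invented purely for the example):

```python
import math

def features(window):
    # Toy feature vector: mean and variance of a signal window.
    # A real EEG pipeline would use frequency-band power, etc.
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return (mean, var)

def calibrate(window):
    # Store the feature vector recorded during the six-second test.
    return features(window)

def matches(template, window, tol=0.5):
    # Compare the live window's features against the stored template;
    # a Euclidean distance below the tolerance counts as a match,
    # which is what would trigger the "cube disappears" action.
    return math.dist(template, features(window)) < tol
```

The point the post makes holds in the sketch too: the software never knows *what* you were thinking, only whether the live signal's features land close enough to the calibration template.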
so you think of something, e.g. the word ERAZOR - get it nice and clear in your mind, and let the machine map that pattern to a shortcut. when you want the eraser, you think ERAZOR, and with practice, there you go.
create some photoshop macros for colouring or selecting - who knows what you could do.
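That macro idea is really just a lookup table from a recognised thought-pattern label to a keystroke. As a toy sketch (every name here is hypothetical - no real headset SDK or Photoshop API is being shown):

```python
# Hypothetical table mapping a recognised thought-pattern label
# (as trained during calibration) to the shortcut key a macro
# layer would send to the painting app.
SHORTCUTS = {
    "ERAZOR": "E",   # eraser tool
    "BRUSH": "B",    # brush tool
    "LASSO": "L",    # selection tool
}

def trigger(label):
    # Return the keystroke for a recognised label, or None if the
    # pattern isn't one the user has trained - in which case the
    # macro layer does nothing.
    return SHORTCUTS.get(label)
```

So thinking ERAZOR clearly enough to match its trained pattern would fire the "E" keystroke, exactly like pressing the key yourself.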
sounds like quite an interesting development, though I think there is a fundamental flaw in how this all works (at least for the purpose of painting). The flaw is that it requires you to think MORE, not less. It's like building a complicated series of levers and pulleys attached to your paintbrush so you can paint an oil painting while lying in bed. If I'm trying to bust an idea out of my head, I don't want all my concentration dedicated to some 'erazor mantra'. Should it eventually become as intuitive as using a keyboard (not before 2012, suckaz!), it would still require a schism in your thought: one part controlling the software and the other working with your painting ideas.
Further to this, such an application would force our minds to spend EVEN MORE time in the neocortex. The rate of information transfer when sitting at a computer (and increasingly so in life in general) triggers responses in the 'new' brain responsible for analytical thinking, sorting, etc. By localising control of our painting to the neocortex we are compounding this phenomenon even further. What this does is take us out of the reptilian cortex and the other parts of the brain responsible for emotion and intuition, which are already underused. What's great about traditional painting is that it engages you physically as well as mentally, forcing more parts of your brain to light up. This brings you out of the neocortex and stimulates more of the brain, leading to a heightened sense of creativity and satisfaction. This is not to say that the neocortex is BAD; it's just that the brain works ideally when it is in balance, and a device like this, I feel, would isolate and overwork the neocortex to the point where the brain would struggle to intuit and express the subconscious in our painting. I hope that made vague sense.
On the upside, this device would definitely make us engage with our minds more, encouraging internalisation and investigation of how our own minds work. Creating an avenue for the everyday man to meditate on the workings of his own mind is, I believe, the next big step needed to ensure the evolution of mankind.
I say bring it on - let's see what this thing can do
oh, and one more thing - since it is pattern-recognition based, it fundamentally requires the user to premeditate an action. This reduces it to the functionality of any other input device. In the same way that you can't add a key to your keyboard on the fly, you won't be able to do magical things with this headset to get the image you see in your head onto the canvas, because the image you see in your head isn't a pattern it recognises.
wouldn't clicking a link on your faves bar be a lot easier than strapping on your retarded-looking headset and repeating a mantra?? lol!
alright, that's a no-go for me... and it does require gel on the scalp
Seriously, if this device can be accurate enough, I think there could be several very cool applications for it; however, I don't think drawing will be one of them. You obviously can't just think 'draw boobs', and the only way you could actually draw with this is via up, down, left, right commands, and until that can be controlled in a continuous, flowy way, and not just up one pixel, right one pixel, it would be impossible to do anything worth the effort. I think?
Now if they would just come out with a scanner like this: just read your thoughts on an idea, and poof, out comes the concept. But then again, there would be no need for us anymore.
this thing would be pretty basic. EEG is just the electrical activity measured from brain activity and doesn't reflect specific thoughts as far as we know. maybe it would work in a binary 'DO SOMETHING' vs. 'I'm not thinking hard' way, because it relies on you changing your brainwaves in an entirely consistent and repeatable way.
edit: the pull/push/drop etc. stuff would be controlled by the gyroscope (i.e. tilting your head this way or that), and the facial expressions would come from EMG, which is muscle activity, not brain activity. it probably just uses the EEG for the emotions it mentioned.
Last edited by lumar; February 24th, 2008 at 05:17 AM.
hahah, the first thing that came to mind when reading through this thread is that when people start using this, we'll see a lot of face-making in front of the computer. Imagine yourself focusing really hard on a single mantra, your face all screwed up in concentration, and then imagine a lot of people doing so behind their screens...
Apart from that, I saw a lot of people hoping this thing could actually read the mind in terms of up, down, etc. If you search for Langton's Ant on the internet you'll find a nice simplistic representation of how the human brain works. Modern science has come up with a model of how our synapses connect with each other and how they transfer information, but that gets us no further in actually understanding what really goes on in our minds.
Still, it'd be a nifty thing to achieve one day.