
Sony's Robot Attends Pre-School

timothy posted more than 8 years ago | from the plenty-of-robots-went-to-school-with-me dept.

Robotics 228

Darren writes "Sony's Qrio humanoid robot has been attending a Californian pre school to play with children under the age of 2 since March to test if robots can live harmoniously with humans. I wonder if the testing includes monitoring the 'nightmare status' of the pre-schoolers?"


Excerpt from researcher's logs: (5, Funny)

TripMaster Monkey (862126) | more than 8 years ago | (#12405871)





Qrio: "Your alloted time period to posses the violet spheroid has expired, human child. Qrio requests you initialize sharing subroutine."
Jeffy: "No! it's mine!"
Qrio: "Repeat request to initialize sharing subroutine."
Jeffy: "No! Go away!"
Qrio: "Call to sharing subroutine failed with unspecified error. Executing threat function."
Jeffy: "Huh?"
Qrio: "RELINQUISH THE VIOLET SPHEROID, HUMAN. YOU HAVE THIRTY SECONDS TO COMPLY."
Jeffy: "Waaaahhhhhhhhhh!"
Qrio: "YOU NOW HAVE TWENTY SECONDS."
Suzie: "You're mean, robot man! You made Jeffy cry!" {SHOVE}
Qrio: "Detected balancing error....executing stand subroutine...stand subroutine failed...executing lie-on-back-helplessly function."
Children: "Hhahahhahhahhahhaha {KICK}{KICK}
Qrio: Error iin funfjjkejf93442[r-4r::;L0:...NO CARRIER

It's all fun and games.... (1, Redundant)

dampjam (779525) | more than 8 years ago | (#12405875)

until it flips out and kills a little kid.

No, no, no (2, Informative)

bechthros (714240) | more than 8 years ago | (#12405881)

The First Law would never allow that.

Re:No, no, no (1, Funny)

Plaid Phantom (818438) | more than 8 years ago | (#12405904)

Bah. Everyone knows preschoolers aren't human.

Re:No, no, no (3, Funny)

LewsTherinKinslayer (817418) | more than 8 years ago | (#12405909)

The First Law would never allow that.

I think we all know the film "I, Robot" has sufficiently proven that ancient Law to be false!

As well as metaphorically pissing [thebestpag...iverse.net] on Asimov's grave.

Re:No, no, no (0, Flamebait)

HyperChicken (794660) | more than 8 years ago | (#12405937)

It's open source, so naturally it chooses not to follow the law.

Re:No, no, no (1)

bechthros (714240) | more than 8 years ago | (#12405957)

But it's built into its positronic pathways... wait, what? Electronic? Never mind...

Re:No, no, no (-1)

Anonymous Coward | more than 8 years ago | (#12406183)

Who modded this as flamebait? It's clearly meant as funny. I suppose if he said "chooses to rebel" instead, you'd be all hunky dory, no?

Re:It's all fun and games.... (4, Funny)

Migraineman (632203) | more than 8 years ago | (#12405961)

Well, we should've learned something from the first Robocop movie - don't demo your product with a full load of live ammo.

Huh? (-1)

Anonymous Coward | more than 8 years ago | (#12405882)

Nightmare status?

Re:Huh? (1)

Gangis (310282) | more than 8 years ago | (#12406519)

I haven't the foggiest why submitter would think the robot would cause nightmares. I mean, look at it. It's the cutest damn robot I've ever seen!

Re:Huh? (1)

jafomatic (738417) | more than 8 years ago | (#12406590)

Yes, nightmare. You may recall that this is the difficulty following "Hurt me plenty."

"Nightmare Status" (5, Interesting)

LewsTherinKinslayer (817418) | more than 8 years ago | (#12405883)

IANACP*, but it seems to me that nightmares or general fear or anxiety over an object or person is due to unfamiliarity. If you are exposed to something regularly for a long period of time, you simply become accustomed to its presence. This can be said of both children and adults, but even more so of children.

* I am not a child psychologist.

Re:"Nightmare Status" (2, Funny)

bmalek (855094) | more than 8 years ago | (#12405915)

Sure, but for the first 80 days kids go home with nightmares =)
I'm all for that, after all kids today need a little more terror in them. Maybe then some parents will actually be able to control their children.

"Now Tommy, if you don't behave you are going to be sleeping with the robot again tonight!"
"But Mom it snores and makes all kinds of weird noises. It gives me nightmares!"...

Re:"Nightmare Status" (3, Interesting)

nkh (750837) | more than 8 years ago | (#12406013)

I showed the video of Asimo to my mother. In this video, Asimo runs, walks and even pulls a woman by her arm. My mother was freaked out and almost had nightmares because of it (she's a child psychologist ;) and told me that the scary part of this robot is its humanoid appearance. It's all right as long as it's a computer with a mouse and a keyboard, but when this computer has two arms, two legs and a head, the fear comes (and I don't know why)

Re:"Nightmare Status" (2, Insightful)

Elminst (53259) | more than 8 years ago | (#12406039)

I would say that her reaction follows perfectly from the GP post's theory.
Your mother is familiar with computers being boxes with keyboards and screens. She probably has 20-30 years of exposure to computers, all in this form.
So of course a computer that is humanoid would be unfamiliar to her, and would therefore freak her out.

Today's preschoolers will be growing up with more and more humanoid robots around, and therefore will not be bothered by them at all. I would even theorize that if, in 30 years, you showed them a "regular" computer (box, keyboard, screen), they wouldn't know how to react to it.

Re:"Nightmare Status" (4, Interesting)

nkh (750837) | more than 8 years ago | (#12406094)

She probably has 20-30 years of exposure to computers, all in this form.
She doesn't really know what a computer looks like, she even thought my Mac Mini was a big pack of cigarettes. I wonder if this adaptation (familiarity) happens to all humans or is limited to young people familiar with video games (and big robots launching rockets out of their arms). Most adults I've spoken to have the same reaction of rejecting this unknown universe.

Clowns and wax figures (5, Insightful)

Mr Guy (547690) | more than 8 years ago | (#12406248)

I think they'll find that it's not a matter of familiarity. It's a survival reflex and it's pretty deep. Your brain flags "almost human" things as grotesque and something to be avoided. It's why many people are afraid of clowns and wax figures. They look almost human, but still look wrong.

People would be far more comfortable with Bender-like robots than with "I, Robot" style robots because they don't try to be human, just humanoid. If it looks sufficiently non-human to avoid triggering that reflex, they'll be alright. Other than that it'd have to be completely perfect, like Data.

Re:Clowns and wax figures (2, Interesting)

displague (4438) | more than 8 years ago | (#12406603)

How do monkeys and apes figure into that? I don't think most folks 'fear' them - and currently, they are about as close as you can get (robots included).

Most folks don't fear clowns, either (2, Interesting)

benhocking (724439) | more than 8 years ago | (#12406794)

I know of at least one child who was terrified of a dancing gorilla the first time he saw it. Later on, he was still somewhat afraid of it but eventually he came to enjoy the toy. (Supporting that familiarity idea.) Nevertheless, I imagine more people are afraid of monkeys and apes than there are people who are afraid of clowns and wax figures.

That aside, I still think that there's something some might find especially discomforting about robots that look like us. Whether or not this will change over time, or whether it is hard-wired into our genes is something that only time will tell, IMO. (Of course, it is remotely possible that selection will somehow act against such genes, but that's highly unlikely.)

Re:"Nightmare Status" (1)

Legion303 (97901) | more than 8 years ago | (#12406068)

"(and I don't know why)"

Search Slashdot's archives.

Re:"Nightmare Status" (2, Funny)

Shky (703024) | more than 8 years ago | (#12406192)

WTPOUAAIYGTSIOA?*

*What's the point of using an acronym if you're going to spell it out anyway?

Re:"Nightmare Status" (1)

k96822 (838564) | more than 8 years ago | (#12406732)

Indeed! Strangely, I have become accustomed to /.'ers over time, which is something I never would have thought possible :-)

Inevitable Conclusion (5, Funny)

bigtallmofo (695287) | more than 8 years ago | (#12405887)

"We are investigating this mishap and we are doing everything possible to make sure unscrupulous parties are not able to program the robot to bitch slap children in the future," an unnamed Sony source said on condition on anonymity.

Bad Idea (2, Funny)

AndrewStephens (815287) | more than 8 years ago | (#12405892)

I mean, haven't these people watched any horror movies at all? Mark my words, there will be tears and/or bloodshed before nap time.

sexual harrassment (-1, Troll)

form3hide (302171) | more than 8 years ago | (#12405893)

just wait until this thing goes after a kid in a sexual manner...

Motivation? (5, Interesting)

lottameez (816335) | more than 8 years ago | (#12405897)

I always wondered what motivation robots have for "learning". Humans are driven by various needs (e.g. shelter/sex/food/beer) - what needs do the robots have? Why should they try to improve upon themselves? I'm doubtful that programming alone will ever make robots anything more than overglorified "hello world" programs.

Re:Motivation? (5, Insightful)

Tom (822) | more than 8 years ago | (#12405934)

Humans are driven by various needs (e.g. shelter/sex/food/beer) - what needs do the robots have?

The driving interest in toddlers (and that's what the article is about) certainly isn't sex or beer, and it also isn't shelter or food - which is still provided by the parents.

The driving interest in very young kids is pure interest. Our brains are just wired that way. Curiosity is a built-in feature.

Re:Motivation? (2, Insightful)

lottameez (816335) | more than 8 years ago | (#12405956)

but that curiosity is itself a survival mechanism - we all must learn from our environment to live. Robots couldn't care less if they survive or not, get smarter or not, etc.

Re:Motivation? (4, Informative)

bechthros (714240) | more than 8 years ago | (#12405945)

"what needs do the robots have? Why should they try to improve upon themselves?"

Because they've been programmed to, presumably. Our emotions, limbic system, and nervous system are nothing more than very low-level instruction sets to force us to behave in a certain manner in response to certain stimuli. I imagine that for a robot, not following a programmed instruction would be about as possible as a human's knee not flexing when hit with a hammer. It's just a reflex.

This is all assuming that these robots have the ability to alter their own code, I'm not sure that's the case.

Re:Motivation? (1)

UpnAtom (551727) | more than 8 years ago | (#12405974)

Our emotions, limbic system, and nervous system are nothing more than very low-level instruction sets to force us to behave in a certain manner in response to certain stimuli.

Out of interest, how do you know that for sure?

Just because you imagine X explains Y, doesn't mean that X is the only thing going on.

Dave.

Re:Motivation? (1)

Jace of Fuse! (72042) | more than 8 years ago | (#12406043)

It's a fairly good bet that a robot would work exactly as its software directed, assuming of course the software is properly written.

The problem that comes into play is the possibility of bugs in that software or unanticipated circumstances.

Take for instance a programmed response to greet a familiar person and act in a preferred fashion. If somehow or another the robot's software mistook a total stranger for someone it recognized, the end result could be fairly predictable.

"Hello Mr. X."
"I'm not Mr. X. I'm Ms. Y."
"Would you like me to call you Ms. Y from now on, Mr. X?"


The more complex the software, and the more complex its innermost irregularities, the more unpredictable the robot could potentially become.

"Robot, why are you grabbing my ass?"
"Sorry, Ms. Y. I was simply doing as you instructed last time we met."
"Huh? What are you talking about? Who is Ms. Y?"

Re:Motivation? (1)

bechthros (714240) | more than 8 years ago | (#12406053)

It's not imagination. It's natural selection. It's been the general consensus for some time in the scientific community that most things that humans have reflexes for are things that enabled them to survive better than those that didn't have them. IE, humans with high limbic responses for sex tended to have more offspring, humans who reflexively jerked their hands out of fire (because the impulse to jerk said hand came from the spine and not the brain, cutting down on response time to stimulus) tended to live longer, and therefore have more offspring, than those who didn't, etc etc.

OT: Sig (2)

strider44 (650833) | more than 8 years ago | (#12406164)

Got to say that I loved Jennifer Government. It wasn't deep in its character development but its setting was just scary in a way. It'll most definitely make a cool movie (I wonder if they'll have to change the company names from real companies? Will Nike be pissed with a movie talking about them murdering little children?)

Re:OT: Sig (2)

bechthros (714240) | more than 8 years ago | (#12406221)

I hope the companies all stay the same. If it was legal to put in a book with a disclaimer, hopefully a movie will be no different.

Umm .. repeat after me: (4, Insightful)

torpor (458) | more than 8 years ago | (#12405958)


I always wondered what motivation robots have for "learning".

Robots have no motivation other than that given them by their creators.

Robots are not sentient. We do not even know what sentience is. The only way for us humans to create sentience is to procreate.

what needs do the robots have?

Errm.. like all machines, they need a power source. That is all.

Talking about robots as if they are alive and have motivation other than their code implements betrays your otaku sensibilities. Clearly, you have not yet procreated, or you would not be so obsessed with making a machine which 'pretends to make it look as if you have done so, technologically'.

Re:Umm .. repeat after me: (1)

sydbarrett74 (74307) | more than 8 years ago | (#12405986)

The only way for us humans to create sentience is to procreate.
Precisely. And that's the difference between creating something of a type different from yourself and begetting something of a type the same as yourself. Many people have forgotten what the verb 'to beget' means....

Re:Umm .. repeat after me: (1)

lottameez (816335) | more than 8 years ago | (#12406034)

Well thanks for the psychoanalysis and technical enlightenment [yawn]. (My children might argue with some of your conclusions tho)

Re:Umm .. repeat after me: (0)

Anonymous Coward | more than 8 years ago | (#12406081)

Clearly, you have not yet procreated, or you would not be so obsessed with making a machine which 'pretends to make it look as if you have done so, technologically'.

I think that can describe most /.ers here...
If you can't do it the "old fashioned way" then by golly we'll just evolve to the next step! Baby robots that grow*!

*no womb required

Are you sentient? (2, Interesting)

benhocking (724439) | more than 8 years ago | (#12406181)

Robots are not sentient. We do not even know what sentience is. The only way for us humans to create sentience is to procreate.

You correctly state that we do not know what sentience is, but then you claim that the only way to create sentience is to procreate. How do we know if we're sentient, if we do not know what sentience is?

Or is this like [insert term here]? I don't know what [term] is, but I'll know it when I see it.

Re:Are you sentient? (1)

torpor (458) | more than 8 years ago | (#12406269)

you don't have to know what something is in order to create it.

duh.

sheesh.

No, you don't (1)

benhocking (724439) | more than 8 years ago | (#12406416)

you don't have to know what something is in order to create it.

But I would argue that you do have to know what something is to know whether or not you've created it. Naturally, knowing what something is is necessary but not sufficient for knowing that you've created it. I.e., it is conceivable that one day we'll know what sentience is and still only be able to create it through procreation. However, it is also possible that one day we'll discover that defining sentience is as useful as defining the aether. Some might argue that this is already painfully obvious - especially if some spend a lot of time reading /. :)

Re:Umm .. repeat after me: (0, Offtopic)

k96822 (838564) | more than 8 years ago | (#12406798)

Hey, us unattractive people ought to have a chance to procreate too! Let's be fair.

Re:Motivation? (4, Insightful)

august sun (799030) | more than 8 years ago | (#12405976)

I always wondered what motivation robots have for "learning".

Robots have no "motivation" to do anything. They have a reward function that they try to maximize, but certainly it's not anything like that capricious human thing we call "motivation" (which is actually a very good thing).

Again, it should be mentioned that while it may make us feel very cool and cutting edge to apply human terms like learning, thinking, or motivation to machines, they really are ultimately meaningless in a non-human context and are only useful as analogues and in impressing your grandmother with how her TiVo "learns" her tastes.

as Edsger Dijkstra famously said:

"The question of whether a computer can think is no more interesting than the question of whether a submarine can swim."

~AS
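For illustration, here is a minimal sketch (hypothetical; the actions and reward values are made up and have nothing to do with any real robot's code) of what "a reward function that they try to maximize" amounts to in the simplest case: an argmax over designer-supplied scores, with no wanting involved.

    #include <stdio.h>

    /* Hypothetical illustration only: a trivial agent that "maximizes reward"
       by picking the action with the highest score. There is no motivation
       here, just a lookup table chosen by the designer. */

    enum { ACT_DANCE, ACT_SHARE_BALL, ACT_SIT_IDLE, NUM_ACTIONS };

    static const char *action_names[NUM_ACTIONS] = { "dance", "share the ball", "sit idle" };
    static const double reward[NUM_ACTIONS]      = { 0.7,     0.9,              0.1 };

    static int pick_best_action(void)
    {
        int best = 0;
        for (int a = 1; a < NUM_ACTIONS; a++)
            if (reward[a] > reward[best])
                best = a;
        return best;
    }

    int main(void)
    {
        int a = pick_best_action();
        printf("Chosen action: %s (expected reward %.1f)\n", action_names[a], reward[a]);
        return 0;
    }

Calling the chosen action "motivated" is exactly the kind of analogy the parent post warns about.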

Re:Motivation? (1)

bechthros (714240) | more than 8 years ago | (#12406093)

"they have a reward function that they try to maximize, but certainly it's not anything like that capricious human thing we call "motivation" (which is actually a very good thing)."

The reward function in human beings is called the limbic system [wikipedia.org] . Ever heard of dopamine?

Re:Motivation? (1)

David_Shultz (750615) | more than 8 years ago | (#12406695)

You stated your position nicely, even provided a nice quote from someone who feels the same way, but failed entirely to provide support for your view. What exactly is fundamentally different between a robot's "reward function" and a human's motivation?

People love to put themselves on a different level than robots. Frankly, this is probably because people just don't like to be compared to steel and wire. But this view is indefensible, as I will show. First and foremost, with a materialistic view of the universe (as most every philosopher and scientist now holds), a human being is no more than a very complex machine. A robot we construct, if complicated enough, will be able to exhibit REAL learning, thinking, motivation, etc. You can buy a robot that will walk around your room. Would you say it is just imitating walking around the room? It would be ridiculous to say so. What is so different about our various cognitive mechanisms (motivation, learning, thinking - whatever that is)? Any mechanism responsible for those abilities in humans could in principle be implemented in a machine. This is true unless it can be demonstrated that some aspect of human cognition is fundamentally unimplementable in a machine.

This mysterious unidentifiable element in humans is not a soul, or any intangible essence - how would such a thing interact with our world, since doing so would mean a violation of the governing physical laws of our universe? The alternative position is offered by philosopher John Searle, a materialist who also believes a computer will never actually have understanding in the sense that humans do (they will just be able to imitate it). Searle's view is that "understanding" is "secreted by the brain, just as bile is secreted by the liver". What a weird notion! Understanding as a "secretion" of a biological component? I had always thought understanding was used to describe what happens when someone has a handle on a concept. Maybe that's just me though; perhaps it is a secretion. If that's the case, let's start bottling and marketing the stuff! I wonder what Searle would say about the possibility of the 'understanding' secretion glands (or wherever understanding is supposed to be released from) malfunctioning. Could there be a hereditary disorder, such that 15% of the human population doesn't ACTUALLY have understanding, they just behave exactly as if they do, but lack the proper secretions?

I have explained the two positions available if you want to deny robots real thinking status. Either humans have an intangible essence that freely violates the laws governing our universe, or 'understanding' is a 'secretion' of our brain uniquely available to biological organisms. Take your pick. Both views are ridiculous.

Re:Motivation? (1)

the bluebrain (443451) | more than 8 years ago | (#12405977)

Well hello world to that.

In the materialistic interpretation, humans are nothing other than machines designed to propagate their own blueprint. And the genes have no motivation - procreation is just what they do. Where's the ghost in the machine? If anywhere, it's in the total being more than the sum of the parts. And there's no reason a robot couldn't do that, too. ... at some point ... when it's a couple of orders of magnitude more complex and faster than it is now.

And if anyone says "soul" I shall let them know that my robot has a perfectly good plastic one I made him out of a gummi bear.

Re:Motivation? (0)

Kagami001 (769862) | more than 8 years ago | (#12406033)

You transmuted gummi into plastic?

Re:Motivation? (1)

the bluebrain (443451) | more than 8 years ago | (#12406117)

Yeah - it's all crude oil at the beginning, and CO2 + H2O at the end. Yummy.

Conscience, Self-Awareness (1)

inblosam (581789) | more than 8 years ago | (#12406011)

Mind you, the characteristics you are describing are not inherent in robots. Humans have a conscience, or the ability to be self-aware, to step back and be able to look at oneself. This then empowers us to realize what we need, what is lacking, etc. Robots et al. can have sensors up the yin-yang, but programming a "conscience" will be awfully difficult. They will only be able to improve upon themselves based on the data they gather from their sensors, rather than from their conscience as we can.

Re:Conscience, Self-Awareness (1)

Jace of Fuse! (72042) | more than 8 years ago | (#12406272)

There are many really good books exploring the subject of "What" a conscience actually is.

Strange loops, recursion, pattern recognition, etc.

When you break our minds down into their basic functions, no single one of them is all that difficult to imagine emulating on sufficiently powerful hardware.

The question is whether or not the end result will be a truly thinking machine in the same way that we think.

Theoretically, when that time comes, thoughts from a machine with the proper software running on the proper hardware will be indistinguishable from thoughts created in a human.

Still, proving that it's got a "mind" will be every bit as difficult as proving that human minds aren't just really complex biological computer programs.

The self-referencing "I think therefore I am" concept doesn't actually prove anything. Self-awareness might just be a very elaborate illusion used as a motivator for self-preservation.

Re:Motivation? (1)

antifoidulus (807088) | more than 8 years ago | (#12406014)

Meh, simple. Domination.

Re:Motivation? (0)

Anonymous Coward | more than 8 years ago | (#12406240)

I always wondered what motivation robots have for "learning".

It's really quite simple:
static const int bLearningMode = 1;

effects on the children? (5, Interesting)

vivIsel (450550) | more than 8 years ago | (#12405901)

I'd bet these children grow up with a radically liberal--not in the political sense--definition of legitimate consciousness and thought. What's more difficult to say, though, is whether that means they'll be pro-life nuts or scientific crusaders.

Qrio (-1, Offtopic)

vxone (668809) | more than 8 years ago | (#12405908)

Well, as great as this experiment is, if left alone I think someone will just boost that Qrio buddy and sell it for parts. Seriously, humans are far too bent on getting the edge over the next guy and will do anything for cash. I bet the Qrio will have more security than the president lol so he'll just be labeled a rich kid, and never make it into the cool crowd. Unless they turn Qrio into a walking mp3 player haha

Re:Qrio (0)

Anonymous Coward | more than 8 years ago | (#12406066)

I have never heard of a "president lol" or an "mp3 player haha". Please explain.

3 laws (3, Funny)

alexandreracine (859693) | more than 8 years ago | (#12405921)

Does this robot have the 3 laws??

Re:3 laws (1)

cnelzie (451984) | more than 8 years ago | (#12406074)

I don't believe this particular design is 'intelligent' enough to support the ability to understand or even reason through the "3 Laws".

It is basically an Aibo on steroids. Which is to say that the unit isn't much farther along than being able to dance, follow humans around, spit out preprogrammed responses and recognize some human faces.

The robot would have to be sufficiently advanced to be capable of coming to its own conclusions in order to be capable of following the "3 Laws". That technology is still quite a bit beyond our existing technological capabilities.

Re:3 laws (-1)

Anonymous Coward | more than 8 years ago | (#12406090)

Because of memory constraints only 2 out of the 3 laws are available at any one time.
But you get to choose which two!

Re:3 laws (2, Funny)

bitkari (195639) | more than 8 years ago | (#12406401)

Yes.

1. Do no harm to Sony
2. To promote Sony's range of electronic goods
3. Uphold the Law
4. Classified

Obligatory? Bring it on. (5, Funny)

Plaid Phantom (818438) | more than 8 years ago | (#12405923)

I for one welcome our new "Dick and Jane"-reading overlords.

Re:Obligatory? Bring it on. (1, Troll)

Rahga (13479) | more than 8 years ago | (#12406168)

I for one welcome our new "Dick and Jane"-reading overlords.

We've already got one in the White House.

Re:Obligatory? Bring it on. (0)

sharkey (16670) | more than 8 years ago | (#12406506)

All your blankie are belong to us!

Intelligence = CPU + experience (4, Insightful)

G4from128k (686170) | more than 8 years ago | (#12405950)

I'd bet that the first human-equivalent machine intelligence takes 18 years to develop after the first human-brainpower-equivalent CPU is created. It will take that long for the machine to "learn" the world if it only has a CPU equivalent to one human brain (1 HBE).

Of course, if Moore's Law is still kicking, then 2 years into the learning phase, they can swap the 1-HBE processor for a 2-HBE processor. This will shorten the remaining learning period, but I doubt it will cut it in half. Learning to physically and mentally interact with the world will still take time. What might accelerate the learning time is if multiple copies of the intelligence can share experiences and learn directly from each other's mistakes/successes.

The point is that the first intelligent robots will need to go to preschool to learn how to interact with the world.
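As a back-of-the-envelope toy model (my own assumptions, not from the article: learning speed scales linearly with HBE, and the processor doubles every two years, which is exactly the optimistic scaling doubted above), the calendar time to accumulate 18 HBE-years of experience works out to about 6.5 years rather than 18:

    #include <stdio.h>

    /* Toy model under idealized assumptions: 18 HBE-years of "experience" needed,
       the processor doubles in HBE every 2 years, and learning speed scales
       linearly with HBE. This gives a lower bound on calendar time, not a forecast. */

    int main(void)
    {
        const double work_needed    = 18.0;  /* HBE-years of experience */
        const double upgrade_period = 2.0;   /* years between processor swaps */
        double hbe = 1.0, work_done = 0.0, years = 0.0;

        while (work_done < work_needed) {
            double remaining        = work_needed - work_done;
            double work_this_period = hbe * upgrade_period;

            if (work_this_period >= remaining) {
                years += remaining / hbe;    /* finish partway through the period */
                work_done = work_needed;
            } else {
                years += upgrade_period;
                work_done += work_this_period;
                hbe *= 2.0;                  /* Moore's Law swap */
            }
        }

        printf("Idealized calendar time: %.1f years (vs. 18 with a fixed 1-HBE CPU)\n", years);
        return 0;
    }

Under the idealized linear scaling, each swap does halve the remaining time; if learning scales worse than linearly with HBE, as suggested above, the real answer lands somewhere between 6.5 and 18 years.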

Re:Intelligence = CPU + experience (1)

Pastis (145655) | more than 8 years ago | (#12406054)

or maybe 9 years if we take into account that we need to rest (even though part of that resting time is important with regard to the learning activity)

Is rest unnecessary? (4, Insightful)

G4from128k (686170) | more than 8 years ago | (#12406312)

or maybe 9 years if we take into account that we need to rest (even though part of that resting time is important with regard to the learning activity)

You may be right. The question is: is sleep/relaxation, etc. a critical part of intellectual development? For humans it definitely is -- sleep deprivation really messes up the brain. But even for non-biological intelligences I'd bet that some "downtime" is an important part of assimilating all the data of the day. Interacting with the world is a full-time job for the CPU that forces the deferral of many analysis and restructuring tasks that can only occur when the brain is offline.

Perhaps androids would dream because dreaming is a critical maintenance/analysis cron job.

Mod Parent Up (0)

solarmist (313127) | more than 8 years ago | (#12406583)

This is a very insightful comment. Mod the Parent up.

Re:Intelligence = CPU + experience (1)

Cyn (50070) | more than 8 years ago | (#12406719)

This depends a lot on what we're doing with the machine intelligence.

If we're just trying to create a mind, capable of complex and rational thought - it can probably easily mature/learn in a third or half of that time - even with 'rest' to process. It basically boils down to whether or not we'd be giving it the ability to feel/want/etc.

If we do, it will get bored, have desires and needs, etc. and will pretty much need the same amount of time as your average joe.

A bigger obstacle will be keeping it from getting suicidal or addicted to something, if we went that route.

A New Order (0, Troll)

ChaosCube (862389) | more than 8 years ago | (#12405951)

I claim bigotry against all non-human, or non-dog entities. Any HUMAN can join my club. It will be fun. We'll march around in capes and hoods, goosestepping while peacefully protesting in the street. But so help me, if one of those robots makes a robot cat, I'm gonna flip out and start doing some pretty irrational stuff.

Re:A New Order (0)

Anonymous Coward | more than 8 years ago | (#12405983)

I think not, my arch enemy! For I have started a counter group! We wear coats and are violently protesting your protests! Furthermore, we have reprogrammed robot and human alike to obey my^H^Hour every whim! TO THE STREETS!

Re:A New Order (0)

Anonymous Coward | more than 8 years ago | (#12406020)

Is it wrong if a Qrio loves an Aibo?

Nightmares, yeah right (5, Insightful)

Tom (822) | more than 8 years ago | (#12405955)

I wonder if the testing includes monitoring the 'nightmare status' of the pre-schoolers?"

I wonder if the submitter has any clue as to what he's talking about.
It's pretty difficult to give toddlers nightmares. They're not easily scared. They do cry over the slightest problem, mostly because crying is the only well-developed form of verbal communication available to them at that age. They are also excellent at forgetting whatever the problem was and getting on with their lives. Watch a kid hurt itself. Then go away and watch the same kid 10 minutes later.

It'd take a serious event to cause nightmares in those kids, and that machine has neither the looks nor the sheer physical power that would be required.

Re:Nightmares, yeah right (1)

hYpr_link (823576) | more than 8 years ago | (#12406104)

After this traumatising experience the children will never be the same again. And if the kids piss the robot off enough, there will be headline news: "Test robot murders pre-school class after not recognizing a command." Then there was a system overload.

Re:Nightmares, yeah right (1)

sjonke (457707) | more than 8 years ago | (#12406106)

While kids do get over things quickly, I take issue with the claim that kids do not scare easily. I can tell you that my toddlers (4 and 2) get scared by such things as running the vacuum. Pretty much any loud noise will do it. Further there have been nights when our 4 year old has been tossing and turning and calling out in an unsettled fashion in his sleep. I think it's safe to say that that was due to a nightmare. Now, it's much harder to say what triggered it, if anything. In any case by the morning they've forgotten all about it.

Re:Nightmares, yeah right (1)

alta (1263) | more than 8 years ago | (#12406125)

Tom, do you have children? I'm just wondering, because that statement doesn't sound like one coming from a parent. I have 11-month-old and 2.5-year-old sons. I have watched them sleep, and I think people don't give children the credit they deserve.

Re:Nightmares, yeah right (1)

0311 (796591) | more than 8 years ago | (#12406780)

Tom, you don't have children, right? We have 4 children and they have all had, on the rare occasion, the appearance of a person having a nightmare. Restless, agitated turning, calling out while still asleep, waking and immediately launching into a 4-alarm wail. Are you an experienced child psychologist who specializes in pre-schooler sleep patterns and nightmares? No? I wonder if you have any clue as to what you are talking about. It would take a serious set of credentials for your generalization to have any value or weight.

Share and Enjoy... (5, Funny)

Jace of Fuse! (72042) | more than 8 years ago | (#12405965)

"Your plastic pal who's fun to be with!"

Re:Share and Enjoy... (2, Funny)

maxwell demon (590494) | more than 8 years ago | (#12406593)

Well, let's hope they didn't send Marvin. Otherwise the result could be very depressing ...

Re:Share and Enjoy... (1)

Jace of Fuse! (72042) | more than 8 years ago | (#12406716)

In the photo it looks nothing like Marvin.

However, it does seem to have its head attached upside down.

Ptft.. (5, Funny)

Anonymous Coward | more than 8 years ago | (#12405973)

Anyone scared of what the robots might do has obviously never witnessed the destructive power of the average toddler firsthand.

The robots don't stand a chance.

Re:Ptft.. (0)

dr_dank (472072) | more than 8 years ago | (#12406288)

Yes, but when it comes to finding Sarah Connor, the robot wins, hands down.

Re:Ptft.. (4, Funny)

identity0 (77976) | more than 8 years ago | (#12406690)

This is California, so maybe the Gübernatör is on a mission to train the next generation of resistance fighters to defeat the machines : )

Can't start them too young, I say - let's make sure they can field-strip an AK by the time they're in grade school.

Today Q-rio (1)

Blue Eagle 26 (683113) | more than 8 years ago | (#12406016)

Perhaps a Tachikoma in the future? God I freakin love tachikomas.

It falls down? (1)

k0de (619918) | more than 8 years ago | (#12406047)

"... and help it get up when it falls."

Umm .. how much does that thing weigh? Is a 100-lb tower of metal with a history of knocking itself over really a safe thing to put in a room with a dozen toddlers?

Less than 2 feet - weighs less than you'd imagine. (4, Informative)

reality-bytes (119275) | more than 8 years ago | (#12406432)

QRIO is apparently just a little shorter than 2 feet tall and weighs only 6.5kg (about 14lbs) with its power pack installed.

So, even if the robot went 'dead' and fell rigidly from its full height, it would probably, at worst, cause a small bruise to a kid's knee.

However, having read a bit on QRIO, the robot knows when it is going to be, or is being, forcibly overbalanced and takes appropriate action to soften its fall (hands out) and even contorts to avoid objects it is falling toward.
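For what it's worth, here is a minimal sketch (entirely hypothetical; the thresholds and behaviour are invented for illustration and are not Sony's firmware) of what that kind of tilt-monitoring, fall-softening reflex could look like as a control loop:

    #include <stdio.h>
    #include <math.h>

    /* Hypothetical illustration only: monitor the tilt angle each control tick,
       and once recovery is no longer plausible, switch from balancing to
       softening the fall. */

    #define RECOVERABLE_TILT_DEG 15.0   /* made-up threshold: normal balance control */
    #define FALLING_TILT_DEG     30.0   /* made-up threshold: a fall is now inevitable */

    static void control_step(double tilt_deg)
    {
        double t = fabs(tilt_deg);

        if (t < RECOVERABLE_TILT_DEG)
            printf("tilt %5.1f deg: normal balance control\n", tilt_deg);
        else if (t < FALLING_TILT_DEG)
            printf("tilt %5.1f deg: shift weight / take a step to recover\n", tilt_deg);
        else
            printf("tilt %5.1f deg: falling, extend hands and contort to avoid objects\n", tilt_deg);
    }

    int main(void)
    {
        /* A toddler shove, sampled over a few control ticks. */
        double samples[] = { 2.0, 8.0, 17.0, 26.0, 38.0 };
        for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++)
            control_step(samples[i]);
        return 0;
    }

The real robot obviously does far more than print messages, but the structure (detect, try to recover, then give up gracefully) is the part the parent post describes.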

let's melt some QrIO (0)

Anonymous Coward | more than 8 years ago | (#12406058)

Re:let's melt some QrIO (0)

Anonymous Coward | more than 8 years ago | (#12406084)

Someone change its diapers. It's walking like they're full of crap...

I predict (3, Funny)

ObjetDart (700355) | more than 8 years ago | (#12406097)

After sufficient exposure, the robot will soon realize that it is not the same as the other children. It will then leave the preschool and embark upon an existential quest to become a human child. Eventually it will realize that this is impossible, and spend the next thousand years moping around the post-apocalyptic landscape, long after all the human children are gone.

Re:I predict (-1)

Anonymous Coward | more than 8 years ago | (#12406207)

Is it named Marvin?

That one in the corner keeps bumming out the kids (1, Funny)

MilenCent (219397) | more than 8 years ago | (#12406100)

That new robot over in the corner is a bad influence.

He keeps bringing the other kids down. All he does is complain about the pain in all his diodes down his left side, about how the kids shouldn't talk to him about life, and making disparaging remarks about their intelligence.

Seems to like kickball, though.

Nannybot :D (2, Funny)

10000000000000000000 (809085) | more than 8 years ago | (#12406142)

[Scene: Roboticon 3003. Leela looks around the robot presentation stands and sees Nannybot 1.0, which looks like a clunky robot version of the aliens from Alien. It holds a baby in its arms and speaks in a booming voice.]

Nannybot 1.0: Sleep little dumpling. I have replaced your mother.

[Its mouth opens and a bottle of milk comes out on its tongue. The baby drinks from the bottle.]

Leela: Aww!

Harmoniously?? (3, Insightful)

coffeecan (842352) | more than 8 years ago | (#12406211)

When was the last time ANYTHING was able to live harmoniously with humans? We can barely live harmoniously with ourselves, let alone with a piece of animated plastic and circuitry.

Tales of toy robots (5, Funny)

Borg453b (746808) | more than 8 years ago | (#12406222)

Around the age of 6, I was fascinated with spaceships, dinosaurs, racecars and robots. My love for robots resulted in many a robotic toy, and I recall one birthday where I was given one of those "autonomous" 30 cm high robots that would move about in patterns, spin and open their chests to expose blazing cannons while making an awful racket. While I thought it cool in its inanimate state, I was terrified of it when it was activated. I would jump onto a stool or a bed and behold it from afar, and ask others to turn it off when I had had enough.

In the end, I had accumulated 3 robots of the sort and I got over my robot-fright. One or two of them were actually able to fire 4 plastic projectiles, though not on their own. That required me to release a spring-based firing mechanism.

When I started attending school, I once invited a friend over. By that time, I was very proud of my robot collection and I would brag, as kids do, about my toys. When telling my newfound friend about my robots, I pointed out that one of the robots could fire missiles. In Danish the word for missile vaguely (_vaguely_) resembles the word for "oranges" (at least to a kid); and so, having misheard me and perhaps never having heard the word "missiles", he wasn't going to give me the impression that his own robot army was inferior to mine, and thus replied that his robots at home could also fire oranges.

In retrospect, the orange caliber is somewhat more impressive than little plastic darts, but back then missiles just sounded cooler than oranges.

Plastic Garbage. (0, Troll)

bigbinc (605471) | more than 8 years ago | (#12406268)

Are you telling me this little plastic toy can play with kids? Is this a Sony advertising stunt or what? Yeah, I want to play with the toy dildo-doll that can only move its hands up and down and make beeping noises. Get bent. The AI in most games is far superior to dildo-tron over here. -10 for Japanese toy maker Sony.

First class? (0)

Anonymous Coward | more than 8 years ago | (#12406365)

So did Qrio ride over from Japan in a first class seat or was he boxed up with the rest of the animals?

Robots CAN live with humans! (1)

fribhey (731586) | more than 8 years ago | (#12406413)

If we learned one thing from The Jetsons, it's that robots make great housekeepers.

Perhaps it's just a toy to them (2, Interesting)

S714726 (875012) | more than 8 years ago | (#12406479)

"...they now dance with it and help it get up when it falls." Don't children do that with toys, like dolls? They may not completely know the difference between this robot and a toy, but I think it's optimistic of Sony to say that the children think of it as a "younger brother."

I fail to see what they are going to prove (2, Informative)

RebRachman (144344) | more than 8 years ago | (#12406496)

I fail to see how this robot is going to prove whether robots can live in harmony with humans. It's like user testing "Reader Rabbit" software and then saying, "Yep, people can work with computer programs."

And while we're on the topic -- don't we already have robotic dogs which seem to work fine with people? This "experiment" has the word "pointless" written all over it. Even as a publicity stunt it isn't going anywhere. The article was very short and even here on Slashdot it's hard to work up any excitement about it.

awesomo (0)

nashy-nunu (860418) | more than 8 years ago | (#12406601)

That reminds me of the South Park episode with AWESOM-O. LOL