
Titan Supercomputer Debuts for Open Scientific Research

samzenpus posted about 2 years ago | from the greased-lightning dept.

Supercomputing 87

hypnosec writes "Oak Ridge National Laboratory has unveiled a new supercomputer, Titan, which it claims is the world's most powerful, capable of 20 petaflops of performance. The Cray XK7 supercomputer contains a total of 18,688 nodes, and each node pairs a 16-core AMD Opteron 6274 processor with an Nvidia Tesla K20 Graphics Processing Unit (GPU). To be used for researching climate change and other data-intensive tasks, the supercomputer is equipped with more than 700 terabytes of memory."


Police officer charged with plan to cook, eat women (-1)

Anonymous Coward | about 2 years ago | (#41805919)

A New York City police officer was charged [yahoo.com] on Thursday with conspiring to kidnap, torture, cook and eat women whose names he listed in his computer. In a criminal complaint unsealed in Manhattan federal court, Gilberto Valle III, 28, of Forest Hills, Queens, was charged with conspiring to cross state lines to kidnap the women and with illegally accessing a federal database. The charges carry a maximum sentence of life in prison.

Re:Police officer charged with plan to cook,eat wo (1)

Anonymous Coward | about 2 years ago | (#41805999)

I don't get the context... are you suggesting that this supercomputer could have predicted this, if used for pre-crime analysis, ala Minority Report?

It's because he used a COMPUTER to plot his crime (1)

Andy Prough (2730467) | about 2 years ago | (#41806705)

Computers are evil - didn't you know?

Re:Police officer charged with plan to cook,eat wo (4, Funny)

Big Hairy Ian (1155547) | about 2 years ago | (#41806037)

Damn if I'd just been a petaflop faster I'd have had 1st post!

Re:Police officer charged with plan to cook,eat wo (1)

Anonymous Coward | about 2 years ago | (#41806151)

Isn't a petaflop when some animal rights activist makes an outrageous claim that everybody laughs at?

Instead of distracting it with 'climate change' (2, Insightful)

fustakrakich (1673220) | about 2 years ago | (#41806149)

Why not have it figure out a way of helping us build clean energy sources and reduce contamination? The climate changes all the time. We should learn to live with it.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41806495)

Because there are somehow still a lot of stupid people who need convincing that the climate does indeed change, I guess.

We need fusion.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41807253)

Actually having a good predictive understanding of climate change is still really important, especially "if fusion".

If cheap, reliable fusion power becomes a reality, the first thing that is going to happen is a big-time build-up of heat from power stations, factories, and just about anything else we stick a fusion reactor in.

So yeah, CO2 release may go down, but heat release will likely spike dramatically within a 50-year period. So what will that mean for future weather systems?

Re:Instead of distracting it with 'climate change' (1)

Antipater (2053064) | about 2 years ago | (#41807753)

Yeah, because none of our current power sources create any heat.

Re:Instead of distracting it with 'climate change' (3, Insightful)

mcgrew (92797) | about 2 years ago | (#41807727)

I see that the guy who moderated you insightful is as ignorant of computers' workings as you are. Computers don't figure things out. There is no such thing as an "electronic brain" or a "thinking machine." Computers are nothing more than huge electronic abacuses. They don't figure things out; the scientists figure things out (theorize) and then test their theories using computerized models when they can't do direct testing.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41808033)

Fuck you.

-Skynet

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41810971)

Wrong. AI is all about having computers figure things out. They're not very good at it, but they do it ALL the time, and no, it's not just executing the instructions the programmer decided it should execute.

Next time, talk about things you know.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41812685)

Wrong. At best, AI is currently capable of learning some patterns, but it is nowhere near figuring things out. In other words, detecting an unusual pattern in credit card usage is learning, and is not at all the same thing as a computer figuring out how credit card technology works.

Re:Instead of distracting it with 'climate change' (1)

fustakrakich (1673220) | about 2 years ago | (#41811183)

Jeebus! I'm just saying that climate change has become a bullshit distraction and nothing more than a way to get unlimited funding, but all these people are as hysterical about it as they are about terrorism. We don't need to use megawatts of power to predict what's going to happen 50 or 100 years from now as if it's all gonna happen in a five-second event. Oh well, I guess I have to assume you didn't get the gist of my original post, which was pretty much: do what you can to reduce pollution regardless of how the climate changes. Devote the processing to things that matter.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41812961)

And again you are wrong. We do need a computer to figure out what's going to happen 50-100 years from now due to climate change. Just look at the (real) news and see how frequently they're updating and correcting predictions. Hint: It's happening faster than originally predicted.

Re:Instead of distracting it with 'climate change' (1)

riverat1 (1048260) | about 2 years ago | (#41819275)

Global warming is a bigger threat to us than terrorism is.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41812059)

There is no such thing as a "thinking machine."

Tell that to the people who made the 2-tonne black monstrosity in the loading bay down the hall.

Re:Instead of distracting it with 'climate change' (1)

baldrad (1882464) | about 2 years ago | (#41813667)

Tell that to the people who made the 2-tonne black monstrosity in the loading bay down the hall.

Dude don't be so racist

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41808981)

As someone who has worked on research using this machine's predecessor, I can say that it is being used to research materials for use in clean energy sources.

Re:Instead of distracting it with 'climate change' (0)

Anonymous Coward | about 2 years ago | (#41814303)

Or why not just make it endlessly calculate new digits of pi so we can be happy by breaking more and more records?

oblig.. (0)

slashmojo (818930) | about 2 years ago | (#41806157)

imagine a beowulf cluster of those!

Newegg does it again! (1)

WOOFYGOOFY (1334993) | about 2 years ago | (#41806159)

I'm waiting for the ShellShocker promo in my email before I upgrade to this baby.

700TB not as exciting as it sounds (4, Interesting)

HappyHead (11389) | about 2 years ago | (#41806171)

The memory they list as an exciting "700+TB" is not actually all that exciting - if you divide it by the number of nodes, and then by the number of CPU cores, that leaves only about 2GB of RAM per CPU core, which is pretty much standard for HPC cluster memory. The only really impressive thing here is the number of compute nodes involved, and no single submitted job will have access to all of them. I manage similar, though smaller, research clusters myself, and frankly, the only clusters we had with less than 2GB per CPU core were retired long ago. Essentially, this means they're running the cluster with the minimum amount of memory considered acceptable for the application.
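A back-of-the-envelope check of that figure (a minimal C sketch; the 700 TB total and the node and core counts come from the story summary above):

    #include <stdio.h>

    int main(void) {
        const double total_tb = 700.0;   /* total system memory, per the summary */
        const int nodes = 18688;         /* Cray XK7 compute nodes */
        const int cores_per_node = 16;   /* one 16-core Opteron 6274 per node */

        double gb_per_node = total_tb * 1024.0 / nodes;
        double gb_per_core = gb_per_node / cores_per_node;

        /* Prints roughly 38.4 GB per node and 2.40 GB per core -- i.e. ~2 GB/core. */
        printf("%.1f GB per node, %.2f GB per CPU core\n", gb_per_node, gb_per_core);
        return 0;
    }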

Re:700TB not as exciting as it sounds (2)

K. S. Kyosuke (729550) | about 2 years ago | (#41806667)

Would it actually be useful? Yes, you'd have more memory in total, but any given amount of memory for a computational job would have constrained bandwidth. As far as I understand it, this is the Achilles' heel of modern machines: what use is a large memory to you when you can barely keep the execution units busy, even with caches? Especially in HPC, where the coherence of accesses just isn't there.

Re:700TB not as exciting as it sounds (1)

Anonymous Coward | about 2 years ago | (#41806973)

Not really... some applications (e.g., fluid dynamics simulations) scale just fine.

Re:700TB not as exciting as it sounds (1)

HappyHead (11389) | about 2 years ago | (#41809019)

Larger memory per node is useful when manipulating stupidly huge data sets. Sometimes speed isn't the most important aspect of getting the calculations done, and other factors come into play, like memory size/bandwidth, available disk space, the speed of that disk space, and even network connectivity if you're doing MPI programming.

While I realize it would be great to teach everyone efficient programming techniques, so they could streamline their memory usage down to the bare minimum, it's not always possible, and sometimes it's just not practical to do - our users come from pretty much all disciplines: Physics, Biology, Chemistry, Engineering, and even a few from History. (Yes, a History researcher using HPC to do calculations and simulations. He's actually doing some pretty neat stuff.) Teaching that diverse a group of people to program super efficiently is not going to work - they're not interested in making super awesome code, they just want their numbers crunched, and are only willing to learn the bare minimum to get it running. The worst cases tend to get assigned a staff member to consult with them and get their code cleaned up so that they don't break the clusters, but with a few thousand users, we can't do that with everybody - most of them would never show up to the classes anyway.

Re:700TB not as exciting as it sounds (1)

gentryx (759438) | about 2 years ago | (#41810163)

If you need more memory, simply allocate more nodes. Problem solved. Hardly anyone needs more than 2 GB/core.

Re:700TB not as exciting as it sounds (0)

Anonymous Coward | about 2 years ago | (#41812281)

That's not how computers work. If you have a giant pile of data and need to parse it all within the same thread, multiple nodes won't help you, because node A cannot read the memory in node B. In that case, if you aren't an expert programmer who can write super-optimized MPI code, and need your results done _this year_ instead of, say, three years from now when you've finally found someone who can write the program to piece things out like that, then you need a giant pile of memory. As was mentioned in the post you're responding to, people who aren't computer scientists don't always want to spend two or three years learning new ways to code before they get on with their research. For them, more memory is the answer, and demanding that they learn to code better so they don't need it is both stupid and counterproductive.
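To illustrate the distributed-memory point, here is a minimal MPI sketch (illustrative only, not code from the thread): each rank owns its own slice of the data, and anything that lives on another node has to be moved with an explicit message.

    /* Each MPI rank owns only its own data; rank 0 cannot simply
     * dereference rank 1's memory, it must receive a copy.
     * Typical build/run: mpicc demo.c -o demo && mpirun -np 2 ./demo */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        double local = 100.0 * rank;   /* lives only in this node's memory */
        double remote = -1.0;

        if (size >= 2) {
            if (rank == 1)
                MPI_Send(&local, 1, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD);
            else if (rank == 0)
                MPI_Recv(&remote, 1, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
        }
        if (rank == 0)
            printf("rank 0 fetched %.0f from rank 1 via an explicit message\n", remote);

        MPI_Finalize();
        return 0;
    }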

Re:700TB not as exciting as it sounds (3, Informative)

Anonymous Coward | about 2 years ago | (#41807723)

Actually, that's not quite true: it is possible to submit a job request for all 18,688 compute nodes, and in fact the scheduling policy gives preference to such large jobs. It's true that there aren't very many applications that can effectively use that many nodes, but there are a few (such as the global climate simulations). You're correct about the amount of RAM per CPU core, though.

Re:700TB not as exciting as it sounds (1)

Marillion (33728) | about 2 years ago | (#41808907)

The XE6 that my team uses allocates job reservations at the node level. Each job gets a whole node of 16 cores with 32GB of RAM. If you have a memory-intensive task, you only use as many cores as will fit in the available memory. It's a trade-off: some tasks will waste RAM, some will waste CPUs.
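A sketch of that trade-off (assuming the 16-core/32GB node described above; the per-task memory demands are made-up, illustrative numbers):

    #include <stdio.h>

    /* How many tasks can one node run: limited by cores or by memory? */
    static int usable_tasks(int cores, double node_gb, double gb_per_task) {
        int fit = (int)(node_gb / gb_per_task);  /* tasks that fit in RAM */
        return fit < cores ? fit : cores;        /* never more than the cores */
    }

    int main(void) {
        double demand[] = {1.0, 2.0, 4.0, 8.0};  /* GB needed per task */
        for (int i = 0; i < 4; i++)
            printf("%4.1f GB/task -> %2d of 16 cores busy\n",
                   demand[i], usable_tasks(16, 32.0, demand[i]));
        /* At 4 GB/task only 8 cores stay busy (idle CPUs); at 1 GB/task
         * all 16 run but half the RAM sits unused. */
        return 0;
    }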

Re:700TB not as exciting as it sounds (2)

gentryx (759438) | about 2 years ago | (#41810103)

Titan is a capability machine, which distinguishes it from capacity machines. As such, it is designed for large/extreme-scale jobs (which includes full-system runs). I expect the techs are just now prepping Linpack for the next Top500 list at SC12.

The ratio of 2 GB/core isn't going away anytime soon. The reasons are: a) the speed per core is stagnating, so adding more memory per core just means you'd end up with more memory per core than it could process in a timely manner, and b) if you need more memory, you'll just allocate more nodes; the additional cores you get that way don't exactly hurt.

Imagine a beowu... (0)

Anonymous Coward | about 2 years ago | (#41806211)

It's nice to see they used AMD to build the world's fastest supercomputer this time. I love AMD, but using the most inefficient chip (in terms of energy usage and heat output) to model climate change is kind of ironic.

Re:Imagine a beowu... (1)

Anonymous Coward | about 2 years ago | (#41808693)

Wait, isn't the irony that they used AMD CPUs, but Nvidia GPUs, in the same system?

Maybe combined with the fact that last I checked the AMD GPUs were lower watts per flop?

Sounds like they made this cluster explicitly to increase the rate of global warming. And by modeling it on the cluster, they'll always be just behind the curve as the cluster tips the balance it's supposed to be simulating :)

(And yes I know that they're probably using Nvidia because the majority of their tasks leverage cuda, but it's still funny!)

Re:Imagine a beowu... (0)

Anonymous Coward | about 2 years ago | (#41809599)

Or maybe because there's still no Tahiti-based FireStream available in quantity, while Cypress and Cayman *suck* if you're doing nonlinear memory access or data-dependent branching (a factor of 4-10 slower than a GF100...).

K20 ~ Kepler: better perf/watt than GCN/SI (1)

slew (2918) | about 2 years ago | (#41811125)

Although the previous generation of AMD GPUs used to offer more flops/watt, the new generation of Kepler GPUs from Nvidia is quite a bit better than AMD's current generation (GCN / Southern Islands).

http://hothardware.com/Reviews/NVIDIA-GeForce-GTX-680-Review-Kepler-Debuts/?page=15 [hothardware.com]

FYI: the K20s used in the Titan system are Kepler-based.

Sirens (0)

X10 (186866) | about 2 years ago | (#41806269)

Does it have sirens?

Is the K20 really a GPU? (2)

CajunArson (465943) | about 2 years ago | (#41806285)

GPU means graphics processing unit. Now consumer GPUs are pressed into service for compute tasks like BOINC & folding, but they are still GPUs (they can still do graphics).

Does Nvidia even bother to put in the graphics-specific silicon and output hardware on the K20, or should these things really be called.. I dunno.. "compute accelerators" or something like that?

Re:Is the K20 really a GPU? (1)

TeknoHog (164938) | about 2 years ago | (#41806573)

General Processing Units

Re:Is the K20 really a GPU? (1)

CajunArson (465943) | about 2 years ago | (#41807415)

That's the problem though... the K20 is definitely not "general" but highly specialized. Throw the right type of problem at it through optimized CUDA code and it'll run great. Throw the wrong type of computational problem at it and it'll go nowhere fast. That's specialized instead of general.
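To make that concrete, an illustrative C sketch (not code from the thread): accelerators like the K20 shine on uniform, data-parallel loops and stall on divergent, pointer-chasing work.

    #include <stddef.h>

    /* Accelerator-friendly: one uniform operation over contiguous data
     * (SAXPY). Every element is independent, loads are sequential, and
     * there is no data-dependent branching. */
    void saxpy(size_t n, float a, const float *x, float *y) {
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    /* Accelerator-hostile: pointer chasing with data-dependent branches.
     * The next address is unknown until the previous load completes, and
     * parallel threads would diverge on the branch. */
    struct node { float val; struct node *next; };

    float chase(const struct node *p, float threshold) {
        float sum = 0.0f;
        while (p) {
            if (p->val > threshold)  /* branch depends on the data itself */
                sum += p->val;
            p = p->next;             /* serial dependency between loads */
        }
        return sum;
    }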

Re:Is the K20 really a GPU? (1)

Shatrat (855151) | about 2 years ago | (#41806579)

Maybe you should consider the origin of the word? http://en.wikipedia.org/wiki/Graph_(mathematics) [wikipedia.org]

Re:Is the K20 really a GPU? (1)

CajunArson (465943) | about 2 years ago | (#41807335)

You have it backwards... graphics have been around since the time of the caveman. "Graph theory" only came into existence in the late 19th century and took its original cues from hand-drawn graphs... which are a type of graphics. Plus, adding and multiplying numbers, which is basically what the K20 does on a huge scale, is by no means an operation that is limited to graph theory.

Re:Is the K20 really a GPU? (1)

Anonymous Coward | about 2 years ago | (#41807691)

With teraflops of single and double precision performance, NVIDIA® Kepler GPU Computing Accelerators are the world’s fastest and most efficient high performance computing (HPC) companion processors. [nvidia.com]

The K20 really is still built on a graphics-specific architecture, but it would be a waste to include the output hardware; just think of all the servers that have never had a monitor attached.

Re:Is the K20 really a GPU? (0)

Anonymous Coward | about 2 years ago | (#41811427)

With teraflops of single and double precision performance, NVIDIA® Kepler GPU Computing Accelerators are the world’s fastest and most efficient high performance computing (HPC) companion processors. [nvidia.com]

The K20 really is still built on a graphics-specific architecture, but it would be a waste to include the output hardware; just think of all the servers that have never had a monitor attached.

Considering less than 2% of the chip is output hardware, and probably less than 10% of those chips are sold into HPC systems, it probably doesn't make sense to take it out. They are only selling a million or so chips to HPC folks; the tens of millions sold to high-end gamers and CAD workstations would want the output hardware, and they aren't gonna make a special chip for the HPC folks. It doesn't cost any power, since they just turn the display portion of the chip off.

Ironically, the low-end *notebook* versions of graphics chips tend not to have any output hardware (after rendering, they just DMA the data back to the Intel integrated graphics for display). With these low-end chips, there's enough volume to justify taping out a different chip to reduce cost.

On a more serious note (2, Interesting)

WOOFYGOOFY (1334993) | about 2 years ago | (#41806355)

It's a great and important tool for policy makers to be able to crunch this magnitude of data, but a lack of that ability is not the problem wrt climate change.

The problem is purely political, specifically, American conservatives are denying this science the same way they deny the science of evolution, the same way they deny the overwhelming proof that smoking causes cancer and second hand smoke does the same, the same way they denied CFCs caused a hole in the ozone layer and risked all our lives on that occasion also.

On the one hand you have hard-working, selfless scientists who at this point are sacrificing their personal lives, financial security, and sanity, and risking literal criminal prosecution from out-of-control attorneys general who are blind drunk on power and ideology, to continue to speak the truth, Cassandra-fashion, to a heedless and reckless nation.

On the other you have people who have never worked a day in their lives to earn the just authority to advise and inform Congress on this topic, nevertheless holding forth, stealing the authority the other group has worked to earn, and effectively screaming "NO FIRE" in a burning theater, inducing people to do nothing when in reality they must do something in order to survive.

The first group fits perfectly my definition of hero.

The second fits perfectly my definition of murderer.

Re:On a more serious note (0)

Anonymous Coward | about 2 years ago | (#41807697)

Anyone who seriously wants to address these issues has to take a factual, undramatic, and consistent line. That avoids the personal acrimony that makes even the most level-headed people on the other side unwilling to listen. Take climate denialism: many, if not most, of the people on the other side are educated, rational, productive members of society. They can be convinced, if you treat them as you would like to be treated.

No matter what the issue is*, calling someone a murderer will not dispose them to listen to you. It will only widen the divide between you and them.

* Except in cases where someone has willfully and unlawfully killed another person.

Re:On a more serious note (4, Interesting)

WOOFYGOOFY (1334993) | about 2 years ago | (#41808767)

You've got to be kidding.

Climate scientists are routinely subjected to death threats against themselves and their families by the people you now claim just want a rational debate. They are subject to politically motivated FOIA searches, public ridicule, and accusations that they are lying, corrupt, or faking data. They have their emails stolen and their personal lives wrecked through constant harassment. All of that is now SOP for the right-wing lunatic deniers. The Glenn Becks. The Koch brothers. Murdoch. The Cuccinellis. Fox News employees. Lord Monckton. The Heritage Foundation, the religious right, the Ayn Rand amphetamine addicts, et al.

So you've got to be kidding.

The rational debate you want has been happening for the past 30 years, in scientific journals and at symposiums and conferences where rational debate on technical matters occurs. Did they join it? Can they understand it, or do they just tell themselves they can? If you can't understand the arguments, then you need to listen to the experts who can. That's the nature of modern society; that's reality. And when 97% of all qualified experts agree, then that is a bright, clear line on the other side of which lies willful and deliberate manslaughter and murder. Just ask any court how it works.

Just because someone can't accept reality doesn't mean they are exempt from the morality that applies to their judgement and actions which issue as a consequence of their reality denial.

If I am thoroughly convinced I can perform brain surgery and fake my way into an operation, I can expect to be prosecuted when I am outed. Ditto Lord Monckton and the Koch Brothers and all the wretched animals at the Heritage Foundation.

It's not about rational debate, and it's not about convincing anyone through data or studies or the application of the scientific method. We know that because, as human society defines that process (and it is human society that gets to define that process, not the right wing), it has already taken place.

Don't like the outcome? Tough shit.

Here's the game they're playing: "You can't prove it." You can't prove I was lying. You can't prove I didn't believe my own horseshit. You can't prove that I was perfectly conscious of the fact that the position I was espousing was contrary to reality. I'm safe inside my own brain, where only I know the truth. So you can't prosecute me, because I'm entitled to my opinion.

But you know what? People make laws. They make laws with the directed and specific purpose of punishing anti-social behavior that harms other, innocent humans. We call that behavior "criminal", and it's criminals - through their actions - who ultimately decide what laws we write into existence for the sole purpose of stopping and punishing them.

What these people believe is that the law will not pursue them wherever they go and whatever they do, irrespective of the real-world consequences their actions have on humanity.

Wrong. Dead wrong.

In Nuremberg we hanged Germans for breaking laws we made up after the fact - ex post facto lawmaking - specifically to address their crimes. They also thought they were in some sort of legal safe harbor, since what they had done, they had done to their own citizens, acting as agents of a sovereign nation. And they were right. That is, until the day we decided they were wrong. On that day, we made up a new crime: Crimes Against Humanity. Then we tried them for it. Then we hanged them for breaking it.

It's criminals who decide what behavior comes to be seen as criminal. The delusion that the law will not, cannot for some reason follow you THERE is just that- a delusion.

Re:On a more serious note (1)

gander666 (723553) | about 2 years ago | (#41809051)

Damn. And my Mod points expired yesterday.

Re:On a more serious note (0)

Anonymous Coward | about 2 years ago | (#41809077)

Be like JFK: avoid calamity by choosing to respond to the moderates, not the extremists.

Re:On a more serious note (0)

Anonymous Coward | about 2 years ago | (#41810639)

Climate scientists are routinely subjected to death threats against themselves and their families by the people you now claim just want a rational debate.

Not every person who has issues with climate change, or outright denies it, has sent death threats to people. Lumping all of those people together doesn't help anything and will only further distance the more moderate of the group from your position.

If I talk to someone about climate issues, I treat them like a well-meaning adult until they thoroughly demonstrate otherwise. There are quite a few people who will have a polite conversation about the topic, or at least will continue to listen when you're being nice. They might not have the background to see the contradictions within their own side, but they will clearly see and remember it when you contradict reality by making assumptions about them, even if that has nothing to do with whether your original position is correct or not. And a select few will take a mile if given an inch, using your treatment of them as a way to convince people that your side is the crazy and wrong one.

Re:On a more serious note (1)

WOOFYGOOFY (1334993) | about 2 years ago | (#41809123)

You need to read history. What you're saying is exactly what the South said during the run-up to the Civil War. What they said was: we are not convinced. We are of the (then scientific/religious) belief that Negroes are not human. We do not believe as you do, and you must either convince us of your side through rational arguments or do nothing about slavery.

This is what every cult that wants to impose its alternative reality on mainstream society says. "Hey, you can't prove you're right. We are not convinced. We dissent. We have a right to dissent and a right to live as we see fit. You cannot impose your will upon us! It's immoral."

That's why the South called the Civil War the "War of Northern Aggression". Because they were just living their lives and conducting their affairs as their belief system bid. Then the North came along and provoked them and would not leave them be.

You need to read history if you think the "we are not convinced, Sir" meme bears any weight.

Re:On a more serious note (0)

Anonymous Coward | about 2 years ago | (#41809263)

In your two posts and your sig, you equate 'deniers' with
- Nazis
- Slaveholders
- Terrorists

You clearly have no interest in argument. Unless you are implicitly proposing more extreme measures, I am forced to conclude that 'deniers' are simply a focus for the hatred you carry within you. Had you been born in another decade, your hatred may very well have been directed at Jews, blacks, or infidels. No matter. As you are an extremist, there is nothing I can do to convince you, so I'll stop trying.

Re:On a more serious note (1)

WOOFYGOOFY (1334993) | about 2 years ago | (#41809605)

Riiiiiiight..... because there's a moral equivalence between people who hate others because they hold contrary opinions on this or that social issue and people who hate others because the actions of those other people lead directly and indisputably to the death of millions of innocent people.

Miss the Days (1)

rwise2112 (648849) | about 2 years ago | (#41806377)

I don't know if I'm alone in this, but I kind of miss the days when supercomputers weren't just clusters of off-the-shelf components. I feel we've lost something.

Re:Miss the Days (0)

Anonymous Coward | about 2 years ago | (#41806619)

It's simple economics.

Supercomputers of yesteryear no longer make any kind of sense.

Re:Miss the Days (2)

timeOday (582209) | about 2 years ago | (#41807193)

On the other hand, the modern GPU is much closer to the vector units of "classical" supercomputers than anything minis/PCs of that era had.

Re:Miss the Days (1)

rwise2112 (648849) | about 2 years ago | (#41812293)

True enough. I had an account on a Convex C2 supercomputer, which was very vectorised, when I was in university. In our department, we were able to mount the 9-track tapes ourselves. Yeah, that's how old I am!

I had the same feeling.... (1)

bdwoolman (561635) | about 2 years ago | (#41807523)

But then again, it does tell you that the off-the-shelf components we all use are none too shabby. For, as we are all too sick of hearing, the boxes we use right now well outpace those custom-built supercomputers [wikipedia.org] created in the days of yore. Okay, maybe not even yore, maybe even less time than that. But still...

Re:Miss the Days (1)

jaharris87 (1599603) | about 2 years ago | (#41813037)

I don't know if I'm alone in this, but I kind of miss the days when supercomputers weren't just clusters of off-the-shelf components. I feel we've lost something.

HPC is being forced to use off-the-shelf components. There isn't the funding for R&D of application-specific hardware.

Re:Miss the Days (1)

SecurityGuy (217807) | about 2 years ago | (#41832583)

+1. It's hard to care about a "faster" computer when faster just means more nodes. Wow, how are we ever going to top that one? Just build one with more nodes. It's become much more a question of money than innovative technology.

Oak Ridge (1)

Antipater (2053064) | about 2 years ago | (#41806409)

Rather than another supercomputer, couldn't they spend the money on actually upgrading Oak Ridge's infrastructure so the buildings aren't falling apart, and 80-year-old nuns can't walk through the perimeter fence?

Re:Oak Ridge (2)

jbeaupre (752124) | about 2 years ago | (#41806493)

Don't worry. Any intruder will be scanned and sent to the gaming grid.

Re:Oak Ridge (0)

Anonymous Coward | about 2 years ago | (#41806739)

You must be referring to the neighboring Y-12 facility (not ORNL) regarding old infrastructure and a nun breaking through perimeter fences. ORNL is a completely separate DOE facility, not run by the NNSA as Y-12 is.

Re:Oak Ridge (1)

Andy Prough (2730467) | about 2 years ago | (#41806809)

Yeah, those nuns - HUGE problem.

Re:Oak Ridge (1)

SeanAhern (25764) | about 2 years ago | (#41810143)

Wrong plant. ORNL [ornl.gov] is a completely separate laboratory from Y-12 [doe.gov] , even though they're located in the same city.

What everybody wants to know (0)

Anonymous Coward | about 2 years ago | (#41806419)

But can you play Pong on this?

Re:What everybody wants to know (1)

oodaloop (1229816) | about 2 years ago | (#41806633)

What about if it blends, runs Linux, or what it does in Soviet Russia?

Is it lighter than a MacBook Air? (1)

Andy Prough (2730467) | about 2 years ago | (#41806789)

Can it play Angry Birds? How many fart apps will run on it? These are the important questions.

Re:Is it lighter than a MacBook Air? (1)

oodaloop (1229816) | about 2 years ago | (#41808073)

Yes. All of them. I agree.

Official Super Computing List (1)

nsharifi (2761671) | about 2 years ago | (#41806625)

The official Top 500 list [top500.org] (last updated 2012/06) states "Sequoia - BlueGene/Q, Power BQC 16C 1.60 GHz, Custom" as the number one supercomputer. Sequoia is nearly as powerful as Titan.

Which OS? iOS6 or Windows 8? (1)

Andy Prough (2730467) | about 2 years ago | (#41806759)

Cause I mean - their like, so FAST. Right? Right? Surely not grubby old, crufty old Linux - right?

Re:Which OS? iOS6 or Windows 8? (1)

nsharifi (2761671) | about 2 years ago | (#41806819)

You are absolutely wrong. 75% of supercomputers run on Linux. Go and see [top500.org].

Re:Which OS? iOS6 or Windows 8? (1)

Andy Prough (2730467) | about 2 years ago | (#41806891)

You are absolutely wrong. 75% of supercomputers run on Linux. Go and see [top500.org].

Shocking! Say it ain't so! It must be because nasty old Linux stole all that technology from Bill Gates and Steve Jobs.

Re:Which OS? iOS6 or Windows 8? (0)

Anonymous Coward | about 2 years ago | (#41809725)

No, actually Linus stole it from the likes of Dennis Ritchie and Ken Thompson.
 
Oddly these names rarely ever get mentioned in Linux discussions. I guess it really burns the fanbois to pay respects to the true giants of computing.

Re:Which OS? iOS6 or Windows 8? (1)

Andy Prough (2730467) | about 2 years ago | (#41810879)

No, actually Linus stole it from the likes of Dennis Ritchie and Ken Thompson. Oddly these names rarely ever get mentioned in Linux discussions. I guess it really burns the fanbois to pay respects to the true giants of computing.

Shocking! Say it ain't so! It must be because nasty old Linus stole all that technology from Dennis Ritchie and Ken Thompson and Darl McBride.

Happy now, Darl ... errrr .... I mean, "AC"?

Re:Which OS? iOS6 or Windows 8? (2)

fa2k (881632) | about 2 years ago | (#41808785)

You are absolutely wrong. 75% of supercomputers run on Linux. Go and see [top500.org].

I thought that sounded low, so I went and checked at http://i.top500.org/stats [top500.org]. Linux has 92.4% of the top 500. Then you have "Unix" at 4.8% and Mixed at 2.2%.

subsidized GIGO (0)

harvey the nerd (582806) | about 2 years ago | (#41807285)

Giveaway upon giveaway. "Hand-tuned" CAGW models have been reality-constrained GIGO for decades now because they do not represent a full set of physics [nipccreport.org]. Glad AMD has stuck with x86 processors this far, but not so glad about this boondoggle.

you FAIl it (-1)

Anonymous Coward | about 2 years ago | (#41807597)

we 4ll know,

What? (1)

ZonkerWilliam (953437) | about 2 years ago | (#41807859)

20 petaflops of performance...

...700 terabytes of memory

Pfffft that all?!

Re:What? (0)

Anonymous Coward | about 2 years ago | (#41812949)

640 TB ought to be enough for anybody...

here is my question. (0)

Anonymous Coward | about 2 years ago | (#41808861)

Why do you need this computer?

Re:here is my question. (0)

Anonymous Coward | about 2 years ago | (#41809083)

To run Crysis on a 30" monitor?

And the OS is? (0)

Anonymous Coward | about 2 years ago | (#41809957)

For the umpteenth time we have a gee-whiz story about a supercomputer with a bunch of detail on the number of nodes etc. without one mention of the software environment. What? These things run as bare metal with an app on them? Come ON. Let's have a bit of detail on how you make use of all those nodes.

My visualizations (3)

SeanAhern (25764) | about 2 years ago | (#41810099)

My favorite part of the article is the photo that accompanies it. Two of my scientific visualizations are on there, the red/yellow picture of an Alzheimer's plaque being attacked by drugs (behind the N of TITAN) and the silver structure of a proposed ultra-capacitor made from nanotubes (to the right of the N).

Re:My visualizations (0)

Anonymous Coward | about 2 years ago | (#41811391)

Yes, the plaque and ultra-capacitor are nice... but that climate image with the hurricane spinning towards India is killing it!

More proof of Government waste (0)

Anonymous Coward | about 2 years ago | (#41810603)

To be used for researching climate change

Everyone knows that climate change is a scare tactic of the liberal left leaning mainstream media. The 'story' is being propagated to try to steal money from the job creators and, along with Obama-care, siphon the money off to teenage immigrant welfare mothers on drugs in order to buy their votes in future elections. But we know that there is no climate change, it's a fabricated 'story' created by deceitful intellectuals who make their living by taking some of the same government money that rightly belongs to the wealthy while providing no return to the economy. The money would be better spent by the true American job creators to drill for more oil!

At least that's what talk radio and Fox News tell me to think.

XK6 not XK7 (0)

Anonymous Coward | about 2 years ago | (#41813025)

Titan will be a Cray XK6, not XK7. It uses the same chip architecture as an XE6, just replacing a single CPU with a GPU.

Slashdot Throwback sarcasm! (1)

jameshofo (1454841) | about 2 years ago | (#41830203)

I wonder how that would do in a Beowulf cluster!