MojoKid writes Have you ever tried hunting and pecking on a miniature keyboard that's been crammed onto a smartwatch's tiny display? Unless the tips of your fingers somehow resemble the tip of a stylus, you're in for a challenge. Interestingly enough, it's Microsoft that might have the most sensible solution for typing on the small displays running Google's Android Wear platform. Microsoft's research division has built an analog keyboard prototype for Android Wear that eliminates the need to tap at tiny letters, and instead has you write them out. On the surface, such a solution seems like you'd be trading one tedious task for another, though a demo of the technology in action shows that this could be a promising approach — watch how fast the guy in the video is able to hammer out a response.
100 comments | about a week ago
An anonymous reader writes Chrome OS is based on the Linux kernel and designed by Google to work with web applications and installed applications. Chromebooks are among the best-selling laptops on Amazon. However, the developers have decided to drop support for ext2/3/4 on external drives and SD cards. It seems the Chromium OS developers won't implement a feature in Files.app to show inserted EXT volumes in the left nav with read/write privileges. Given that this is the main filesystem family in Linux, and is thereby automatically well supported by anything that leverages Linux, this choice makes absolutely no sense. Google may want to drop support for external storage entirely and push cloud storage on everyone. Overall, Linux users and community members are not happy at all.
344 comments | about a week ago
An anonymous reader writes Buried toward the end of the must-watch keynote by Oculus VR's Chief Scientist, Michael Abrash, was the announcement of a new research division within Oculus which Abrash says is the "first complete, well funded VR research team in close to 20 years." He says that their mission is to advance VR and that the research division will publish its findings and also work with university researchers. The company is now hiring "first-rate programmers, hardware engineers, and researchers of many sorts, including optics, displays, computer vision and tracking, user experience, audio, haptics, and perceptual psychology," to be part of Oculus Research.
16 comments | about two weeks ago
Chris Gordon works for a high-technology company, but he likes analog meters better than digital readouts. In this video, he shows off a bank of old-fashioned meters that display data acquired from digital sources. He says he's no Luddite; he just prefers getting his data in analog form, which gets a little harder every year because hardly any new analog meters are being manufactured. (Alternate Video Link)
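Driving an analog meter from a digital source usually comes down to one conversion step. Here's a minimal sketch of that step under assumed hardware (not Gordon's actual rig): a digital reading is mapped to the PWM duty cycle that, once low-pass filtered, would deflect a DC meter movement to the matching position.

```python
# Sketch of the digital-to-analog step such a rig needs (assumed setup):
# scale a digital reading into a 0.0-1.0 PWM duty cycle. The filtered PWM
# output drives the meter coil, so duty cycle ~ needle deflection.

def meter_duty_cycle(value, lo, hi):
    """Map a reading in [lo, hi] to a 0.0-1.0 duty cycle, clamping
    out-of-range readings so the needle pegs rather than wraps."""
    clamped = min(max(value, lo), hi)
    return (clamped - lo) / (hi - lo)

print(meter_duty_cycle(50, 0, 100))    # 0.5: needle at mid-scale
print(meter_duty_cycle(120, 0, 100))   # 1.0: pegged at full scale
```

On a microcontroller the returned fraction would be written to a PWM timer's compare register; the clamping matters because real meter movements are easily damaged by overdriving.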
155 comments | about two weeks ago
rastos1 writes: In a recent blog post, software developer Bruce Dawson pointed out some issues with the way the FSIN instruction is described in the "Intel® 64 and IA-32 Architectures Software Developer's Manual," noting that the result of FSIN can be very inaccurate in some cases, if compared to the exact mathematical value of the sine function.
Dawson says, "I was shocked when I discovered this. Both the fsin instruction and Intel's documentation are hugely inaccurate, and the inaccurate documentation has led to poor decisions being made. ... Intel has known for years that these instructions are not as accurate as promised. They are now making updates to their documentation. Updating the instruction is not a realistic option."
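The core of Dawson's finding is that FSIN's argument reduction uses an internal value of pi stored to only 66 bits, so for inputs near a multiple of pi most of the tiny result is lost. The sketch below is my own emulation of that failure mode using exact rational arithmetic, not Intel's microcode:

```python
import math
from fractions import Fraction

# Emulation of the failure mode Dawson describes (not Intel's microcode):
# near pi, sin(x) ~= pi - x, so reducing x with a pi of limited precision
# directly corrupts the result.

# pi to 50 decimal digits, as an exact Fraction.
PI = Fraction("3.14159265358979323846264338327950288419716939937510")

def pi_rounded_to(bits):
    """pi rounded to `bits` significant bits, as an exact Fraction."""
    # pi lies in [2, 4), so it has 2 integer bits and bits-2 fraction bits.
    scale = 2 ** (bits - 2)
    return Fraction(round(PI * scale), scale)

def sin_with_reduction(x, pi_bits):
    """sin(x) for x near pi, via reduction with a `pi_bits`-bit pi.
    sin(pi + e) ~= -e, so the (negated) reduced argument IS the answer."""
    r = Fraction(x) - pi_rounded_to(pi_bits)
    return -float(r)

x = math.pi                       # the double closest to pi
exact = float(PI - Fraction(x))   # true sin(x), correctly rounded
print(exact)                             # ~1.2246e-16
print(sin_with_reduction(x, 66))         # 66-bit pi perturbs the low digits
print(sin_with_reduction(x, 128))        # a wide pi matches the true value
```

With a 66-bit pi the absolute error in the reduced argument can reach about 2^-65, which is tiny in absolute terms but enormous relative to a true result of order 10^-16 — exactly the gap between Intel's old 1-ulp claim and reality.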
Intel processors have had a problem with math in the past, too.
238 comments | about two weeks ago
An anonymous reader writes: Computer hardware site AnandTech has posted a detailed introduction to semiconductor technology. It's deep enough to be insightful for understanding the chips that run your devices and the industry that built them, but also short enough that your eyes won't start bleeding in the process. The article starts by explaining why silicon is so important, and how a board is set up, structurally. Then it walks through transistor design and construction, and the underpinnings of CMOS logic. Finally, the article describes the manufacturing steps, including wafer creation, photolithography, and how metal is added/shaped at the end. They then go into the physics behind improving these components. It's a geeky and informative read.
21 comments | about two weeks ago
mikejuk writes The Amazon Picking Challenge at ICRA (the IEEE International Conference on Robotics and Automation) 2015 is about getting a robot to perform the picking task. All the robot has to do is pick a list of items from the automated shelves that Amazon uses and place the items into another automated tray ready for delivery. The prizes are $20,000 for the winner, $5,000 for second place, and $1,000 for third place. In addition, each team can be awarded up to $6,000 to get them and their robot to the conference so that they can participate in the challenge. Amazon is even offering to act as matchmaker between robot companies and teams lacking the robot hardware they need. A Baxter Research Robot will be made available at the contest.
106 comments | about two weeks ago
The Indiegogo crowdfunding campaign for an open-hardware cinema camera has closed far in the black, though the project continues to accept contributions. The Axiom's designers raised enough (€174,520, topping their €100,000 goal) to fund development of their stretch goals (remote control, active lens mount, active battery mount), and then some. If it actually gets built and catches on, it will be interesting to see what custom modules users come up with.
31 comments | about two weeks ago
MojoKid (1002251) writes A new interview with Assassin's Creed Unity senior producer Vincent Pontbriand has some gamers seeing red and others crying "told you so," after the developer revealed that the game's 900p resolution and 30 fps target on consoles is a result of weak CPU performance rather than GPU compute. "Technically we're CPU-bound," Pontbriand said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU that has to process the AI, the number of NPCs we have on screen, all these systems running in parallel. We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise..." This has been read by many as a rather damning referendum on the capabilities of AMD's APU that's under the hood of Sony's and Microsoft's new consoles. To some extent, that's justified; the Jaguar CPU inside both the Sony PS4 and Xbox One is a modest chip with a relatively low clock speed. Both consoles may offer eight CPU threads on paper, but games can't access all that headroom. One thread is reserved for the OS and a few more cores will be used for processing the 3D pipeline. Between the two, Ubisoft may have only had 4-5 cores for AI and other calculations — scarcely more than last gen, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6 / 1.73GHz frequencies of their replacements.
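The arithmetic behind "CPU-bound" is easy to make concrete. Here's a back-of-the-envelope sketch with assumed numbers (the thread split and NPC count are illustrative, not Ubisoft's published figures):

```python
# Back-of-the-envelope per-frame CPU budget for an eight-thread console
# (assumed split; the OS reservation and render-thread count are guesses
# consistent with the summary above, not official figures).

TOTAL_THREADS = 8        # Jaguar cores exposed by PS4 / Xbox One
OS_RESERVED = 1          # one thread reserved for the OS
RENDER_THREADS = 3       # assumed threads feeding the 3D pipeline

game_threads = TOTAL_THREADS - OS_RESERVED - RENDER_THREADS  # 4 left over

TARGET_FPS = 30
frame_budget_ms = 1000 / TARGET_FPS                 # 33.3 ms per frame
cpu_ms_per_frame = frame_budget_ms * game_threads   # ~133 core-ms for AI etc.

NPCS_ON_SCREEN = 5000    # assumed crowd size for illustration
us_per_npc = cpu_ms_per_frame / NPCS_ON_SCREEN * 1000
print(f"{us_per_npc:.1f} microseconds of CPU time per NPC per frame")
```

Under those assumptions, every on-screen NPC gets well under 30 microseconds of CPU time per frame for pathfinding, animation selection, and everything else — which is the shape of the bottleneck Pontbriand describes.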
337 comments | about two weeks ago
An anonymous reader writes The Oculus Rift DK2 VR headset hides under its IR-transparent shell an array of IR LEDs which are picked up by the positional tracker. The data is used to understand where the user's head is in 3D space so that the game engine can update the view accordingly, a critical function for reducing sim sickness and increasing immersion. Unsurprisingly, some enterprising folks wanted to uncover the magic behind Oculus' tech and began reverse engineering the system. Along the way, they discovered some curious info, including a firmware bug which, when fixed, revealed the true view of the positional tracker.
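The principle behind LED-based positional tracking is inverting a camera projection. The toy model below is my own illustration of that forward model, not Oculus' actual pipeline: known 3D LED positions on the headset are projected into the IR camera image, and pose estimation searches for the pose whose projections best match the observed blobs. (The LED layout and camera parameters here are made up.)

```python
# Toy model of LED-constellation tracking (illustrative only): project
# known headset-frame LED positions through a pinhole camera and score a
# candidate pose by reprojection error against the observed blobs.

def project(point, pose, focal_px=700.0, cx=320.0, cy=240.0):
    """Pinhole-project a 3D point (meters) into pixel coordinates.
    `pose` is a translation (tx, ty, tz) to keep the sketch short."""
    x, y, z = (p + t for p, t in zip(point, pose))
    return (cx + focal_px * x / z, cy + focal_px * y / z)

# A few LED positions on the headset's front plate (made-up layout).
leds = [(-0.08, 0.03, 0.0), (0.08, 0.03, 0.0), (0.0, -0.05, 0.0)]

true_pose = (0.0, 0.0, 0.6)   # headset 60 cm in front of the camera
blobs = [project(p, true_pose) for p in leds]   # what the camera "sees"

def reprojection_error(candidate):
    """Sum of squared pixel distances between predicted and observed blobs."""
    return sum((u - bu) ** 2 + (v - bv) ** 2
               for (u, v), (bu, bv) in ((project(p, candidate), b)
                                        for p, b in zip(leds, blobs)))

print(reprojection_error((0.0, 0.0, 0.6)))   # true pose: error is 0.0
print(reprojection_error((0.02, 0.0, 0.6)))  # shifted 2 cm: error > 0
```

A real tracker solves the full six-degree-of-freedom version of this (rotation plus translation) and disambiguates which blob is which LED, which is where the DK2's modulated LED IDs come in.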
26 comments | about two weeks ago
alphadogg (971356) writes A startup called Antumbra, run by five college students, is looking to throw a little soothing light on the plight of people who hunker down in front of their computers until the wee hours, until it feels like their eyes might fall out. Antumbra's open-source-based Glow, which launches in a limited beta of 100 $35 units on Thursday, is a small (1.5" x 1.5" x 0.5") doohickey that attaches to the back of your computer monitor via USB and is designed to enhance your work or gaming experience — and lessen eye strain — by spreading the colors from your screen onto the wall behind it in real time. The idea is to reduce the contrast in colors between the computer screen and the background area. The idea might not be new, and people have been home-brewing their own content-driven lighting like this for a while, but this is the first I've seen that looks like a simple add-on.
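The home-brew versions of this idea usually boil down to one operation: average the colors along each edge of the current frame and push those averages to LEDs behind the monitor. A minimal sketch of that step (my own illustration of the general technique, not Antumbra's firmware):

```python
# Bias-lighting sketch: average the RGB values along one edge of a frame,
# represented here as a 2D list of (r, g, b) rows. A real implementation
# would grab the frame from the GPU and send the result over USB.

def edge_average(frame, edge):
    """Return the average (r, g, b) along the named edge of `frame`."""
    if edge == "top":
        pixels = frame[0]
    elif edge == "bottom":
        pixels = frame[-1]
    elif edge == "left":
        pixels = [row[0] for row in frame]
    else:  # "right"
        pixels = [row[-1] for row in frame]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

# Toy 2x3 "frame": red top row, blue bottom row.
frame = [[(255, 0, 0)] * 3,
         [(0, 0, 255)] * 3]
print(edge_average(frame, "top"))    # (255, 0, 0)
print(edge_average(frame, "left"))   # (127, 0, 127)
```

Run at the display's refresh rate, the wall behind the monitor then shifts color with the on-screen content, cutting the screen/background contrast the Glow targets.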
43 comments | about two weeks ago
vinces99 writes Fusion energy almost sounds too good to be true – zero greenhouse gas emissions, no long-lived radioactive waste, a nearly unlimited fuel supply. Perhaps the biggest roadblock to adopting fusion energy is that the economics haven't penciled out. Fusion power designs aren't cheap enough to outperform systems that use fossil fuels such as coal and natural gas. University of Washington engineers hope to change that. They have designed a concept for a fusion reactor that, when scaled up to the size of a large electrical power plant, would rival costs for a new coal-fired plant with similar electrical output. The team published its reactor design and cost-analysis findings last spring and will present results Oct. 17 at the International Atomic Energy Agency's Fusion Energy Conference in St. Petersburg, Russia.
315 comments | about two weeks ago
HughPickens.com writes Adi Robertson argues that IBM's Model M keyboard, soon to turn 30, is still the only keyboard worth using for many people. Introduced in 1985 as part of the IBM 3161 terminal, the Model M was initially called the "IBM Enhanced Keyboard." A PC-compatible version appeared the following spring, and it officially became standard with the IBM Personal System/2 in 1987. The layout of the Model M has been around so long that today it's simply taken for granted, but the keyboard's descendants have jettisoned one of the Model M's most iconic features — "buckling springs," designed to provide auditory and tactile feedback to the keyboard operator. "Model M owners sometimes ruefully post stories of spouses and coworkers who can't stand the incessant chatter. But fans say the springs' resistance and their audible "click" make it clear when a keypress is registered, reducing errors," writes Robertson. "Maybe more importantly, typing on the Model M is a special, tangible experience. Much like on a typewriter, the sharp click gives every letter a physical presence."
According to Robertson, the Model M is an artifact from a time when high-end computing was still the province of industry, not pleasure. But while today's manufacturers have long since abandoned the concept of durability and longevity, refurbished Model Ms are still available from aficionados like Brandon Ermita, a Princeton University IT manager who recovers them from supply depots and recycling centers and sells them through his site, ClickyKeyboards. "For the very few that still appreciate the tactile feel of a typewriter-based computer keyboard and can still appreciate the simplicity of black letters on white keys, one can still seek out and own an original IBM model M keyboard — a little piece of early computing history," says Ermita. As one Reddit user recently commented, "Those bastards are the ORIGINAL gaming keyboards. No matter how much you abuse it, you'll die before it does."
304 comments | about two weeks ago
MojoKid writes: When Nvidia launched their new GeForce GTX 980 and 970 last month, it was obvious that these cards would be coming to mobile sooner rather than later. The significant increase that Maxwell offers in performance-per-watt means that these GPUs should shine in mobile contexts, maybe even more so than in desktop. Today, Nvidia is unveiling two new mobile GPUs — the GeForce GTX 980M and 970M. Both notebook graphics engines are based on the 28nm Maxwell architecture, and both are trimmed slightly from the full desktop implementation. The GTX 980M is a 1536-core chip (just like the GTX 680 / 680M) while the GTX 970M will pack 1280 cores. Clock speeds are 1038MHz base for the GTX 980M and 924MHz for the GTX 970M, which is significantly faster than the previous-gen GTX 680M's launch speeds. The 980M will carry up to 4GB of RAM, while the 970M will offer 3GB and a smaller memory bus.
From eyeballing relative performance expectations, the GTX 970M should be well-suited to 1080p or below at high detail levels, while the GTX 980M should be capable of ultra detail at 1080p or higher resolutions. Maxwell's better efficiency means that it should offer a significant performance improvement over mobile Kepler, even with the same number of cores. Also with this launch, Nvidia is introducing "Battery Boost" as a solution for games with less demanding graphics: battery life can be extended by governing clock speeds to maintain playable frame rates, rather than burning power driving the GPU to higher frame rates than needed.
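The frame-rate-governing idea can be sketched as a simple control loop. This is my own illustration of the concept, not Nvidia's driver logic, and the clock values are placeholders:

```python
# Toy frame-rate governor in the spirit of "Battery Boost" (illustrative
# only): when rendering runs comfortably above the target frame rate,
# step the GPU clock down; when it falls below target, step back up.

def next_clock(current_mhz, measured_fps, target_fps=30,
               step_mhz=25, min_mhz=400, max_mhz=1038):
    """One governor step, with a small dead band to avoid oscillation."""
    if measured_fps > target_fps + 2:      # well above target: slow down
        return max(min_mhz, current_mhz - step_mhz)
    if measured_fps < target_fps:          # below target: speed back up
        return min(max_mhz, current_mhz + step_mhz)
    return current_mhz                     # within the band: hold

clock = 1038
for fps in (60, 55, 48, 31, 29, 30):
    clock = next_clock(clock, fps)
print(clock)  # 988
```

The power saving comes from the cubic-ish relationship between clock/voltage and GPU power draw: shaving clocks while still hitting the target frame rate trades invisible headroom for battery life.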
29 comments | about two weeks ago
jfruh writes: Traditional LCD panels are rectangular because the tiny chips that drive each pixel of the display are fitted along the edge of the glass panel on which the screen is made. But in a new breed of screens from Sharp, the chips are embedded between the pixels, which allows a lot more freedom in screen shape: only one edge of the screen needs to be a straight line, which could give rise to a host of new applications.
60 comments | about two weeks ago
dcblogs writes: "Gartner predicts one in three jobs will be converted to software, robots and smart machines by 2025," said Peter Sondergaard, Gartner's research director, at its big Orlando conference. "New digital businesses require less labor; machines will make sense of data faster than humans can," he said. Smart machines are an emerging "super class" of technologies that perform a wide variety of work, both the physical and the intellectual kind. Machines, for instance, have been grading multiple-choice tests for years, but now they are grading essays and unstructured text. This cognitive capability in software will extend to other areas, including financial analysis, medical diagnostics and data analytics jobs of all sorts, says Gartner. "Knowledge work will be automated."
405 comments | about two weeks ago
Lasrick writes Dawn Stover looks at unrealistic expectations and the distribution of limited energy resources: 'This is a question that should move from the fringes of the energy debate to its very heart. Economists and energy experts shy away from issues of equity and morality, but climate change and environmental justice are inseparable: It's impossible to talk intelligently about climate without discussing how to distribute limited energy resources. It's highly unlikely that the world can safely produce almost five times as much electricity by 2035 as it does now—which is what it would take to provide everyone with a circa-2010 American standard of living, according to a calculation by University of Colorado environmental studies professor Roger Pielke Jr. The sooner policy makers accept this reality, the sooner they can get to work on a global solution that meets everyone's needs. First, though, they need to understand the difference between needs and wants.' Not something most people even think about.
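The scale of the "almost five times" figure is easy to sanity-check with rough public numbers. The arithmetic below is my own, using approximate figures, not Pielke's actual calculation:

```python
# Rough sanity check of the "almost five times" claim (my own arithmetic
# with approximate public figures, not Pielke's calculation): give a
# projected 2035 world population the 2010 US per-capita electricity use
# and compare to 2010 world generation.

US_PER_CAPITA_MWH = 13.0            # approx. US electricity use per person, 2010
WORLD_POPULATION_2035 = 8.7e9       # approx. mid-range UN projection
WORLD_GENERATION_TWH_2010 = 21_500  # approx. world generation, 2010

needed_twh = US_PER_CAPITA_MWH * WORLD_POPULATION_2035 / 1e6  # MWh -> TWh
multiple = needed_twh / WORLD_GENERATION_TWH_2010
print(f"~{multiple:.1f}x 2010 generation")  # on the order of 5x
```

Small changes to the assumed inputs move the answer around, but any reasonable choice lands in the same neighborhood, which is the point Stover is pressing: the gap is structural, not a rounding error.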
652 comments | about two weeks ago
The Wall Street Journal reports in a paywalled article that a team under Pixel Qi founder and OLPC co-founder Mary Lou Jepsen at Google's skunkworks lab Google X is working on modular video displays that could be expanded by snapping them together "like Lego." Ars Technica, TechSpot, The Verge, and several others summarize the claims made by "three people familiar with the project"; here's a snippet from TechSpot's version: Even in the home and office, the use of multiple displays isn’t uncommon but just like with larger implementations often used for advertising purposes, screen bezels are always a problem. Bezels are less visible from a distance but up close, they pretty much ruin the experience. The scope and target audience for the project is unclear at this hour as we are told the project is currently in an early stage. One of the biggest challenges is figuring out how to stitch images together across screens, both electronically and through software.
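The software half of the stitching problem starts with carving one logical framebuffer into per-panel regions. The sketch below is my own toy illustration of that step (the actual Google X approach is not public), ignoring the hard parts like per-panel color calibration and synchronization:

```python
# Toy video-wall stitcher (illustrative only): map each snapped-in panel,
# identified by its (col, row) grid position, to the rectangle of the
# logical framebuffer it must display.

def tile_regions(wall_w, wall_h, tile_w, tile_h):
    """Return {(col, row): (x0, y0, x1, y1)} pixel rectangles per panel."""
    regions = {}
    for row in range(wall_h // tile_h):
        for col in range(wall_w // tile_w):
            x, y = col * tile_w, row * tile_h
            regions[(col, row)] = (x, y, x + tile_w, y + tile_h)
    return regions

# A 3840x2160 logical image across a 2x2 wall of 1920x1080 panels.
regions = tile_regions(3840, 2160, 1920, 1080)
print(regions[(1, 0)])  # (1920, 0, 3840, 1080)
```

For a truly modular wall, the interesting part is what this sketch assumes away: the system would have to discover each panel's position automatically as it snaps in, rather than being told the grid layout.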
56 comments | about two weeks ago
curtwoodward writes: Ian Wright knows how to build high-performance electric cars: he was a co-founder at Tesla Motors and built the X1, a street-legal all-electric car that can go from zero to 60 in 2.9 seconds. But he only cares about trucks now — in fact, boring old garbage trucks and delivery trucks are his favorite. Why? Because, he argues, to disrupt the auto industry with electrification, EV makers should target the biggest gas (and diesel) guzzlers. His new powertrain is very high tech, combining advanced electric motors with an onboard turbine that acts as a generator when batteries run low.
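A turbine range extender makes the truck a series hybrid: the wheels are always driven electrically, and the turbine only spins a generator. A minimal sketch of the supervisory logic (my own toy state machine with made-up thresholds, not Wrightspeed's controller):

```python
# Toy series-hybrid supervisor (illustrative only): run the turbine
# generator when battery state of charge (SOC) is low, with hysteresis so
# the turbine isn't constantly cycling on and off.

def turbine_should_run(soc, turbine_on, low=0.30, high=0.80):
    """Start the generator below `low` SOC, stop above `high` SOC;
    in between, keep doing whatever we were doing."""
    if soc < low:
        return True
    if soc > high:
        return False
    return turbine_on

on = False
for soc in (0.9, 0.5, 0.29, 0.5, 0.81):
    on = turbine_should_run(soc, on)
    print(soc, on)
```

The appeal for garbage and delivery trucks is that the turbine can run at a single efficient speed whenever it runs, while the battery absorbs the brutal stop-and-go duty cycle (and recaptures braking energy).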
174 comments | about two weeks ago
An anonymous reader writes William Steptoe, a senior researcher in the Virtual Environments and Computer Graphics group at University College London, published a paper (PDF) detailing experiments dealing with the seamless integration of virtual objects into a real scene. Participants were tested to see if they could correctly identify which objects in the scene were real or virtual. With standard rendering, participants were able to correctly guess 73% of the time. Once a stylized rendering outline was applied, accuracy dropped to 56% (around chance) and even further to 38% as the stylized rendering was increased. Less accuracy means users were less able to tell the difference between real and virtual objects. Steptoe says that this blurring of real and virtual can increase 'presence', the feeling of being truly present in another space, in immersive augmented reality applications.
75 comments | about two weeks ago