Archive | September 2013

Keeping Your Laptop Plugged in All the Time Will Kill Its Battery Faster

Power down. Photo: Ariel Zambelich/WIRED

Laptops are our indispensable lifeline to the majesty that is the Internet. We use them to work and play from anywhere in the world. But if you’re like most people, you probably keep yours plugged in when you’re at work or home. Stop doing that.

To squeeze as much life as possible out of your lithium-polymer battery, unplug your laptop once it hits 100 percent. In fact, you should unplug it before that.

Cadex Electronics CEO Isidor Buchmann told WIRED that ideally everyone would charge their batteries to 80 percent then let them drain to about 40 percent. This will prolong the life of your battery — in some cases by as much as four times. The reason is that each cell in a lithium-polymer battery is charged to a voltage level. The higher the charge percentage, the higher the voltage level. The more voltage a cell has to store, the more stress it’s put under. That stress leads to fewer discharge cycles. For example, Battery University states that a battery charged to 100 percent will have only 300-500 discharge cycles, while a battery charged to 70 percent will get 1,200-2,000 discharge cycles.

Buchmann would know. His company Cadex sponsors Battery University. The site is the go-to destination for anyone interested in battery technology. And it’s not just constant power that shortens your battery’s life. While batteries degrade naturally, heat also accelerates the degradation. Extreme heat can cause the cells to expand and bubble. Kyle Wiens of iFixit told WIRED: “Too much heat to the battery over time, and the battery isn’t going to last as long.”

You can battle this degradation by keeping the lid open and your laptop out of your actual lap while using it.

While those are simple fixes, Buchmann admits that putting the 40 to 80 percent battery-status workflow into practice is easier said than done. Keeping an eye on your computer’s battery level while trying to work can be a pain. “The ideal would be that the laptop would only charge 80 percent,” Buchmann says, “and if you had to travel, you could push a button before you travel to charge it to 100 percent.”

A search of Windows and OS X apps yielded nothing that would alert a user when a computer reached both an 80 percent charge and a 40 percent discharge. A quick DIY solution is to measure how long it takes to go from 80 percent to 40 percent then set a timer. Do the same thing as it charges from 40 percent to 80 percent. If it saves you money and keeps your battery healthy, it’s worth it.
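If you’re comfortable with a little scripting, you can automate the reminder instead of watching a timer. Here’s a minimal sketch of the idea in Python, assuming the third-party psutil library (any battery-polling tool would do): it watches the charge level and prints a nag at the 80/40 thresholds.

```python
# batterywatch.py -- a minimal sketch using the third-party psutil library.
# It polls the battery and prints a reminder at the 80/40 percent thresholds.
import time

import psutil

HIGH, LOW = 80, 40  # the charge window Buchmann recommends


def watch(interval_seconds=60):
    while True:
        battery = psutil.sensors_battery()
        if battery is None:
            print("No battery detected.")
            return
        if battery.power_plugged and battery.percent >= HIGH:
            print(f"Battery at {battery.percent:.0f}%: time to unplug.")
        elif not battery.power_plugged and battery.percent <= LOW:
            print(f"Battery at {battery.percent:.0f}%: time to plug back in.")
        time.sleep(interval_seconds)


if __name__ == "__main__":
    watch()
```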


Parallax and the iOS 7 Wallpaper

Photo: Josh Valcarcel/WIRED
If you have updated to Apple’s new iPhone operating system (iOS 7), you might have noticed something different. When you are on the home screen, the icons appear to shift a little bit as you move the phone around. It’s sort of annoying and cool at the same time. It produces the illusion that the icons are floating above the background wallpaper.

Parallax

This is a classic example of parallax (although this is fake parallax). You can create another example of parallax yourself. Hold your thumb out straight in front of your face at arm’s length. Now close one eye and find an object in the distance to line your thumb up with. Ready? Now switch the eye that was closed. Notice anything? Your thumb won’t be lined up with that distant object any more. Oh, you didn’t notice anything? Do it again and pay attention. Here is a diagram showing how a person’s two eyes see the thumb at different angles.

Different objects will have different angular shifts depending on their distance from the observation location. Here is an older post with some more pictures.

But is parallax useful? Yes. Parallax can be used to find the distance to some of the closer stars. As the Earth moves from one side of the Sun to the other (during a 6 month time interval), our observation location has shifted by twice our orbital radius. This shift is small compared to the distance to the stars, so it’s a difficult task. However, it can be used with some success.
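Here’s a rough sketch of that astronomy calculation (using the standard small-angle relation, with an example parallax value); the distance follows directly from the baseline and the measured angular shift.

```python
import math

AU = 1.496e11      # meters; the Earth-Sun distance
PARSEC = 3.086e16  # meters


def distance_from_parallax(p_arcsec):
    """Distance to a star from its parallax angle.

    The parallax angle is half the total angular shift seen over 6 months,
    so the effective baseline is one orbital radius (1 AU).
    """
    p_rad = math.radians(p_arcsec / 3600.0)
    return AU / math.tan(p_rad)


# Example: Proxima Centauri has a parallax of about 0.77 arcseconds.
d = distance_from_parallax(0.77)
print(f"{d:.2e} m = {d / PARSEC:.2f} parsecs")  # about 1.3 parsecs
```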

Distance to the Wallpaper

For the iPhone, this is just a tiny bit different from ordinary parallax. Normally, you have a known change in observation position and then use the apparent change in angular position to find the distance to an object. In this case, the observation location (your face) doesn’t move. Instead, the phone tilts. However, this is exactly the same as if the phone were stationary and your head moved. A stationary phone makes for a simpler diagram.

Diagram: the tilting phone and the icon-wallpaper geometry.

Really, I don’t even need the top triangle in this case. I can measure the change in viewing angle, and I can also measure the background shift (s1). If I assume the shift s1 is approximately equal to the arc length it traces, then the following is true.

$$ s_1 \approx r \, \Delta\theta \qquad \Rightarrow \qquad r = \frac{s_1}{\Delta\theta} $$

The greater the background shift, the farther the background sits behind the front icons. How is this different from parallax in astronomy? In that case, you measure the change in angular position and use the known arc length (the baseline) to find the observation distance. So, it’s mostly the same.

But how do I determine the value of s1? That’s fairly simple. I just measure it with a ruler and some screen shots. Here are two screen views at different angles.

Image: two screen views of the wallpaper’s red lines at different tilt angles.

Notice the shift in the red lines in the background (I took a picture of lines just so that this would be easier to see). If I use a real-life ruler, the distance from the top red line to the bottom one is 4.4 cm. I can use this scale to measure the shift in one of the red lines. This measurement could be done in a drawing program by converting pixels to centimeters, or you could load the image into Tracker Video Analysis. For this case, I get a background shift of 0.31 cm.
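In case you want to replicate the scale conversion, here’s the arithmetic as a tiny sketch; the pixel values below are hypothetical placeholders, since you’d read your own numbers off the screenshots.

```python
# Convert a shift measured in screen pixels to centimeters using a known scale.
# The pixel values are hypothetical placeholders; measure your own.
ref_cm = 4.4     # real-life distance between the top and bottom red lines
ref_px = 620.0   # that same distance measured in pixels (placeholder)
shift_px = 43.7  # measured shift of one red line in pixels (placeholder)

cm_per_px = ref_cm / ref_px
shift_cm = shift_px * cm_per_px
print(f"background shift = {shift_cm:.2f} cm")  # 0.31 cm with these numbers
```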

What about the change in viewing angle? Well, it turns out that the compass app in iOS 7 has a built in angle measurement. Yes, it’s on the “second” page of the app. Now, it’s not perfect but I tried to hold the phone steady to switch apps. Here’s what I get.

Screenshot: the compass app’s tilt reading for the first angle.

For the other angle, I get a tilt of 46°. Let’s just call this an angle change of 90°. Now I can put my values in to get the distance from the background to the icons. Remember that I need the angle in units of radians instead of degrees.

$$ r = \frac{s_1}{\Delta\theta} = \frac{0.31\ \text{cm}}{\pi/2} \approx 0.197\ \text{cm} $$

That’s not too far back – and that’s a good thing. You know why? Because I made another assumption that I didn’t disclose. I assumed that the measurements on the background were the actual distances and not the apparent distances. Just imagine the background was 1 meter behind the icons. If that were the case, my measurements wouldn’t really tell you the distance the background shifted. They would give the apparent shift. With a distance of just 0.197 cm, the difference between apparent and actual lengths is small enough to ignore.
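Here’s the whole depth estimate in one short script, using the numbers measured above, so you can swap in your own values when you repeat it.

```python
import math

s1 = 0.31                    # background shift in cm (measured above)
dtheta = math.radians(90.0)  # total change in viewing angle

r = s1 / dtheta              # arc-length approximation: s1 = r * dtheta
print(f"icon-to-wallpaper distance = {r:.3f} cm")  # about 0.197 cm
```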

You should probably repeat my experiment and see if you get the same values. Maybe you could find your own way to measure this background shift.

Further Study

Here are some other things to consider.

  • Since this is just an estimate, what is the uncertainty in the icon-wallpaper distance?
  • How big of a parallax shift would there be if the icons were on the top of the front glass on the phone and the wallpaper was on the inside of the back of the phone (like an empty box)?  For the iPhone 5, this seems to be a distance of about 0.7 cm.
  • In my quick calculation, I just used two data points. What if you measured the background position for many different angles? What should a plot of wallpaper shift vs. angle look like? Does this phone parallax actually look like it should? (See the sketch after this list.)
  • For this example, I tilted the phone forward and backward.  What about a side-to-side tilt?  Does that have the same parallax effect?
  • Here is the most important thing to do.  Build some type of model of the icon-wallpaper system such that the wallpaper shifts just like the iPhone effect.  You could probably reproduce this with a Lego background and lifted Lego icons.
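For that plot question, a minimal sketch (assuming the same arc-length model used above, where the shift grows linearly with angle) would look like this; measured shifts that fall off the line would suggest the fake parallax doesn’t behave like the real thing.

```python
import numpy as np
import matplotlib.pyplot as plt

r = 0.197  # cm; the icon-to-wallpaper depth estimated above

theta = np.radians(np.linspace(0, 90, 50))  # tilt angles, 0 to 90 degrees
shift = r * theta                           # arc-length model: s = r * theta

plt.plot(np.degrees(theta), shift)
plt.xlabel("tilt angle (degrees)")
plt.ylabel("predicted wallpaper shift (cm)")
plt.title("Wallpaper shift vs. tilt angle (arc-length model)")
plt.show()
```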

Notice that I listed these as “further study” instead of my usual homework.  This isn’t homework (though you can still do it if you like).  These are notes to my future self so I can finish what I started.


Now You Can Rent a Tesla Model S

Traveling to S.F. or L.A.? Now you can show off your impeccable taste in performance and green cred by renting a Tesla Model S from Hertz.

The car rental company has added Tesla’s all-electric wünder-sedan to its “Dream Cars” fleet in California, and it didn’t skimp with the base 60 kWh model; it’s offering the fully charged 85 kWh version, with a 265-mile range and a 0-60 time of four seconds.

The Model S joins an impressive list of other Dream Cars from Hertz, including the Aston Martin Vantage, Audi R8, Bentley Continental GT, Ferrari F430, Lamborghini Gallardo, SRT Viper, and Porsche’s 911, Boxster, and Panamera. All of those rides and more are available in 35 special rental fleets from Miami to Las Vegas, but the Model S is a California-only affair for now, and commands $400 per day — better than a last-generation Ferrari for $1,200. Regardless, get the expanded insurance and liability coverage.


How a Crypto ‘Backdoor’ Pitted the Tech World Against the NSA

Illustration: alengo/Getty Images
In August 2007, a young programmer in Microsoft’s Windows security group stood up to give a five-minute turbo talk at the annual Crypto conference in Santa Barbara.

It was a Tuesday evening, part of the conference’s traditional rump session, when a hodge-podge of short talks are presented outside of the conference’s main lineup. To draw attendees away from the wine and beer that competed for their attention at that hour, presenters sometimes tried to sex up their talks with provocative titles like “Does Bob Go to Prison?” or “How to Steal Cars – A Practical Attack on KeeLoq” or “The Only Rump Session Talk With Pamela Anderson.”

Dan Shumow and his Microsoft colleague Niels Ferguson titled theirs, provocatively, “On the Possibility of a Back Door in the NIST SP800-90 Dual Ec Prng.” It was a title only a crypto geek would love or get.

The talk was only nine slides long (.pdf). But those nine slides were potentially dynamite. They laid out a case showing that a new encryption standard, given a stamp of approval by the U.S. government, possessed a glaring weakness that made an algorithm in it susceptible to cracking. But the weakness they described wasn’t just an average vulnerability, it had the kind of properties one would want if one were intentionally inserting a backdoor to make the algorithm susceptible to cracking by design.

For such a dramatic presentation — by mathematicians’ standards — the reaction to it was surprisingly muted. “I think folks thought, ‘Well that’s interesting,’ and, ‘Wow, it looks like maybe there was a flaw in the design,’” says a senior Microsoft manager who was at the talk. “But there wasn’t a huge reaction.”

Six years later, that’s all changed.

Early this month the New York Times drew a connection between their talk and memos leaked by Edward Snowden, classified Top Secret, that apparently confirm that the weakness in the standard and so-called Dual_EC_DRBG algorithm was indeed a backdoor. The Times story implies that the backdoor was intentionally put there by the NSA as part of a $250-million, decade-long covert operation by the agency to weaken and undermine the integrity of a number of encryption systems used by millions of people around the world.

The Times story has kindled a firestorm over the integrity of the byzantine process that produces security standards. The National Institute of Standards and Technology, which approved Dual_EC_DRBG and the standard, is now facing a crisis of confidence, having been forced to re-open the standard for public discussion, while security and crypto firms scramble to unravel how deeply the suspect algorithm infiltrated their code, if at all. On Thursday, corporate giant RSA Security publicly renounced Dual_EC_DRBG, while also conceding that its commercial suite of cryptographic libraries had been using the bad algorithm as its default algorithm for years.

But beneath the flames, a surprising uncertainty is still smoldering over whether Dual_EC_DRBG really is backdoored. The Times, crypto experts note, hasn’t released the memos that purport to prove the existence of a backdoor, and the paper’s direct quotes from the classified documents don’t mention any backdoor in the algorithm or efforts by the NSA to weaken it or the standard. They only discuss efforts to push the standard through committees for approval.

Jon Callas, the CTO of Silent Circle, whose company offers encrypted phone communication, delivered a different rump session talk at the Crypto conference in 2007 and saw the presentation by Shumow. He says he wasn’t alarmed by it at the time and still has doubts that what was exposed was actually a backdoor, in part because the algorithm is so badly done.

“If [NSA] spent $250 million weakening the standard and this is the best that they could do, then we have nothing to fear from them,” he says. “Because this was really ham-fisted. When you put on your conspiratorial hat about what the NSA would be doing, you would expect something more devious, Machiavellian … and this thing is just laughably bad. This is Boris and Natasha sort of stuff.”

Indeed, the Microsoft presenters themselves — who declined to comment for this article — didn’t press the backdoor theory in their talk. They didn’t mention NSA at all, and went out of their way to avoid accusing NIST of anything. “WE ARE NOT SAYING: NIST intentionally put a back door in this PRNG,” read the last slide of their deck.

The Microsoft manager who spoke with WIRED on condition of anonymity thinks the provocative title of the 2007 presentation overstates the issue with the algorithm and is being misinterpreted. Perhaps, he suggests, reporters at the Times read something in a classified document showing that the NSA worked on the algorithm and pushed it through the standards process, and quickly took that as proof that the 2007 talk had been right to call the weakness in the standard and algorithm a backdoor.

But Paul Kocher, president and chief scientist of Cryptography Research, says that regardless of the lack of evidence in the Times story, he discounts the “bad cryptography” explanation for the weakness, in favor of the backdoor one.

“Bad cryptography happens through laziness and ignorance,” he says. “But in this case, a great deal of effort went into creating this and choosing a structure that happens to be amenable to attack.

“What’s mathematically creative [with this algorithm] is that when you look at it, you can’t even prove whether there is a backdoor or not, which is very bizarre in cryptography,” he says. “Usually the presence of a backdoor is something you can prove is there, because you can see it and exploit it…. In my entire career in cryptography, I’ve never seen a vulnerability like this.”

National Security Agency headquarters, Fort Meade, Maryland. Photo: Wikipedia

It’s not the first time the NSA has been accused of installing backdoors. Crypto trapdoors, real and imagined, have been part of NSA lore for decades. In some ways the current controversy echoes the long-ago debate over the first U.S. Data Encryption Standard in the 1970s. The NSA was widely suspected of weakening DES to make it more crackable by the agency by tinkering with a table of numeric constants called an S-Box and shortening the algorithm’s key length. In 1994, though, the NSA was exonerated when it turned out that the agency had actually changed the S-Box numbers to harden DES against a code-breaking technique that had been known only within NSA at the time.

In 1995, another case came up that seemed to confirm suspicions about the NSA. The Baltimore Sun reported that year that the NSA had inserted a backdoor into cryptographic machines made by the respected Swiss company Crypto AG, apparently substantiating longstanding rumors to that effect.

Then in 1999, Microsoft inadvertently kicked off another controversy when it leaked its internal name for a cryptographic signing key built into Windows NT. The key was called _NSAKEY, spawning speculation that Microsoft had secretly given the agency the power to write and sign its own updates to Windows NT’s crypto engine. Microsoft said this was incorrect, that the key was an internal Microsoft key only and that it was called “_NSAKEY” because the NSA was the technical reviewing authority for U.S. export controls. The key was part of Microsoft’s compliance with U.S. export laws.

Suspicions about the NSA and backdoors were lingering in 2006 when Shumow and Ferguson began looking at Dual_EC_DRBG after NIST approved it for inclusion in a standard (.pdf). The standard discussed four federally sanctioned random number generators approved for use in encrypting government classified and unclassified-but-sensitive communication.

Each of the four algorithms was based on a different cryptographic design family. One was based on hash functions, one on so-called HMAC (hash-based message authentication code), one on block ciphers and the fourth one was based on elliptic curves. The NSA had been pushing elliptic curve cryptography for a number of years, and it publicly championed the last one — Dual_EC_DRBG — to be included in the standard.

Elliptic curve algorithms are based on slightly different mathematics than the more common RSA algorithm, and the NSA believes they’re the future of cryptography, asserting that elliptic curve algorithms are smaller, faster and offer better security.

But as Shumow and Ferguson examined the properties of the elliptic curve random number generator in the standard, to determine how to incorporate it into the Windows operating system, a couple of strange things stood out. First, the random number generator was very slow – two to three orders of magnitude slower than another algorithm in the standard.

Second, it didn’t seem to be very secure.

“There was a property [in it] that seemed to make the prediction-resistance of the algorithm not what you would necessarily want it to be,” the Microsoft manager says. In non-geek speak, there was a weakness that made the random number generator not so random.

Good random number generation is at the core of encryption, and a weak RNG can undo the entire encryption system. Random number generators play a role in creating cryptographic keys, in opening secure communications between users and web sites and in resetting passwords for email accounts. Without assured randomness, an attacker can predict what the system will generate and undermine the algorithm.
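To see why, consider a toy example (this uses ordinary seeded pseudorandomness in Python, not any real cryptosystem): if the generator’s seed is guessable, an attacker can simply regenerate the “secret” key.

```python
import random


def make_key(seed):
    """A toy key generator driven by a predictable PRNG (deliberately weak)."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(16))


# The victim seeds with something guessable, like a timestamp (hypothetical value).
victim_seed = 1379721600
key = make_key(victim_seed)

# The attacker brute-forces the small seed space and recovers the identical key.
for guess in range(victim_seed - 5000, victim_seed + 5000):
    if make_key(guess) == key:
        print("key recovered:", make_key(guess).hex())
        break
```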

Shumow and Ferguson found that the obstacles to predicting what the random number generator would generate were low. It wasn’t a catastrophic problem, but it seemed strange for a security system being promulgated by the government.

Then they noticed something else.

The standard, which contained guidelines for implementing the algorithm, included a list of constants – static numbers – that were used in the elliptic curve on which the random number generator was based. Whoever generated the constants, which served as a kind of public key for the algorithm, could have generated a second set of numbers at the same time – a private key.

Anyone possessing that second set of numbers would have what’s known in the cryptography community as “trapdoor information” – that is, they would be able to essentially unlock the encryption algorithm by predicting what the random number generator generated. And, Shumow and Ferguson realized, they could predict this after seeing as few as 32 bytes of output from the generator. With a very small sample, they could crack the entire encryption system used to secure the output.
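To make that concrete, here is a deliberately tiny re-creation of the trapdoor’s structure — not the real Dual_EC_DRBG, but a textbook curve over the integers mod 17 with untruncated outputs. The two public constants P and Q are secretly related by P = d·Q, and knowing d lets an attacker turn a single output into the generator’s next state.

```python
# A toy analogue of the Dual_EC_DRBG trapdoor on a tiny textbook curve.
# This is NOT the real algorithm (no output truncation, a 17-element field);
# it only mirrors the structure: state update s' = x(s*P), output = x(s'*Q).
p, a, b = 17, 2, 2  # the curve y^2 = x^3 + 2x + 2 over the integers mod 17


def add(A, B):
    """Elliptic-curve point addition; None stands for the point at infinity."""
    if A is None:
        return B
    if B is None:
        return A
    (x1, y1), (x2, y2) = A, B
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if A == B:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)


def mul(k, A):
    """Scalar multiplication by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = add(R, A)
        A = add(A, A)
        k >>= 1
    return R


def lift(x):
    """Recover a curve point from its x-coordinate (brute force on a toy field)."""
    rhs = (x ** 3 + a * x + b) % p
    for y in range(p):
        if y * y % p == rhs:
            return (x, y)


Q = (5, 1)     # public constant (a generator of the toy curve's group)
d = 7          # the secret relation that only the constants' author knows
P = mul(d, Q)  # public constant; secretly P = d*Q


def dual_ec_step(s):
    """One generator step: returns the new state and one untruncated output."""
    s_new = mul(s, P)[0]
    return s_new, mul(s_new, Q)[0]


state = 2
state, out1 = dual_ec_step(state)  # the attacker observes only out1
_, out2 = dual_ec_step(state)      # the output the attacker will try to predict

# The attack: lift out1 back to the point s*Q, multiply by the secret d to get
# s*P, whose x-coordinate IS the generator's next internal state.
predicted_state = mul(d, lift(out1))[0]
predicted_out2 = mul(predicted_state, Q)[0]
print("next output:", out2, "| attacker's prediction:", predicted_out2)  # equal
```

On the real curve the constants are astronomically large, so recovering d from P and Q is infeasible for anyone who didn’t generate them — which is exactly what makes such a trapdoor useful only to its author.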

“Even if no one knows the secret numbers, the fact that the backdoor is present makes Dual_EC_DRBG very fragile,” cryptographer Bruce Schneier wrote at the time, in a piece for WIRED. “If someone were to solve just one instance of the algorithm’s elliptic-curve problem, he would effectively have the keys to the kingdom. He could then use it for whatever nefarious purpose he wanted. Or he could publish his result, and render every implementation of the random-number generator completely insecure.”

No one knew who had produced the constants, but it was assumed that because the NSA had pushed the algorithm into the standard, the agency had generated the numbers. The spy agency might also, then, have generated a secret key.

Schneier called it “scary stuff indeed,” but he also said at the time that it made no sense as a backdoor, since it was so obvious to anyone who looked at the algorithm and standard that there was this flaw in it. As a result, developers of web sites and software applications wouldn’t use it to help secure their products and systems, he said.

But in fact, many developers did use it.

The U.S. government has enormous purchasing power, and vendors soon were forced to implement the suspect standard as a condition of selling their products to federal agencies under so-called FIPS certification requirements. Microsoft added support for the standard, including the elliptic curve random-number generator, in a Vista update in February 2008, though it did not make the problematic generator the default algorithm.

Asked why Microsoft supported the algorithm when two of its own employees had shown it to be weakened, a second Microsoft senior manager who spoke with WIRED said that while the weakness in the algorithm and standard was “weird,” it “wasn’t a smoking gun.” It was more of an “odd property.”

Microsoft decided to include the algorithm in its operating system because a major customer was asking for it, because it had been sanctioned by NIST, and because it wasn’t going to be enabled as the default algorithm in the system, thus having no impact on other customers.

“In fact it is nearly impossible for any user to implement or to get this particular random number generator instantiating on their machines without going into the guts of the machine and reconfiguring it,” he says.

Other major companies, like Cisco and RSA, added it as well. NIST in fact provides a lengthy list of companies that have included it in their libraries, though the list doesn’t say which companies made it the default algorithm in their library or which products have been developed that invoke the algorithm.

A Cisco spokesman told WIRED that the algorithm was implemented in its standard crypto library around mid-2012, a library that is used in more than 120 product lines, but the algorithm is not the default, and the default algorithm cannot be changed by users. The company is currently completing an internal audit of all of its products that leverage the NIST standard.

RSA, however, made the algorithm the default in its BSafe toolkit for Java and C developers until this week, when it told WIRED that it was changing the default following the renewed controversy over it. The company sent an advisory to developer customers “strongly” urging them to change the default to one of a number of other random number generator algorithms RSA supports. RSA also changed the default on its own end in BSafe and in an RSA key management system. The company is currently doing an internal review of all of its products to see where the algorithm gets invoked in order to change those.

RSA actually added the algorithm to its libraries in 2004 or 2005, before NIST approved it for the standard in 2006 and before the government made it a requirement for FIPS certification, says Sam Curry, the company’s chief technology officer. The company then made it the default algorithm in BSafe and in its key management system after the algorithm was added to the standard. Curry said that elliptic curve algorithms were all the rage at the time and RSA chose it as the default because it provided certain advantages over the other random number generators, including what he says was better security.

“Cryptography is a changing field. Some algorithms go up and some come down and we make the best decisions we can in any point in time,” he says. “A lot of the hash-based algorithms were getting struck down by some weaknesses in how they chose numbers and in fact what kind of sample set they chose for initial seeding. From our perspective it looked like elliptic curve would be immune to those things.”

Curry says the fact that the algorithm is slower actually provides it with better security in at least one respect.

“The length of time that you have to gather samples will determine the strength of your random number generation. So the fact that it’s slower sometimes gives it a wider sample set to do initial seeding,” he says. “Precisely because it takes a little longer, it actually winds up giving you more randomness in your initial seeding, and that can be an advantage.”

Despite the renewed controversy over the algorithm and standard, Microsoft managers say they still don’t think the weaknesses constitute an intentional backdoor.

Callas agrees. He thinks it is simply bad cryptography that was included in the standard to round out the selection, so that there would be at least one elliptic curve algorithm in the standard.

But there is one advantage to having the algorithm supported in products like Vista, and it may be the reason the NSA pushed it into the standard: even if it’s not the default algorithm for encryption on a system, as long as it’s an option, an intruder like the NSA can get into the system and change the registry to make it the default algorithm used for encryption, theoretically making it easy for the NSA to undermine the encryption and spy on users of the machine.

Schneier says this is a much more efficient and stealthy way of undermining the encryption than simply installing a keystroke logger or other Trojan malware that could be detected.

“A Trojan is really, really big. You can’t say that was a mistake. It’s a massive piece of code collecting keystrokes,” he said. “But changing a bit-one to a bit-two [in the registry to change the default random number generator on the machine] is probably going to be undetected. It is a low conspiracy, highly deniable way of getting a backdoor. So there’s a benefit to getting it into the library and into the product.”

To date, the only confirmation that the algorithm has a backdoor comes in the Times story, based on NSA documents leaked by Edward Snowden, which the Times and two other media outlets saw.

“[I]nternal memos leaked by a former NSA contractor, Edward Snowden, suggest that the NSA generated one of the random number generators used in a 2006 NIST standard — called the Dual EC DRBG standard — which contains a back door for the NSA,” the Times wrote.

An editorial published by the Times this weekend re-asserted the claim: “Unbeknown to the many users of the system, a different government arm, the National Security Agency, secretly inserted a ‘back door’ into the system that allowed federal spies to crack open any data that was encoded using its technology.”

But all of the quotes that the Times published from the memos refer to the NSA getting the standard passed by an international standards body; they do not say the NSA intentionally weakened the algorithm and standard, though the Times implies that this is what the memos mean by tying them to the 2007 presentation by Shumow and Ferguson.

NIST has denied any knowledge of a backdoor and has also denied that the NSA authored its standard. The institute has, however, re-opened the standard for public comment as a result of the controversy and “strongly” urged against using the algorithm in question until the matter could be resolved. The public comments period will close Nov. 6.

Even without more explicit confirmation that the weaknesses in the algorithm and standard constitute a backdoor, Kocher and Schneier believe they do.

“It is extraordinarily bad cryptography,” says Kocher. “If you look at the NSA’s role in creating standards [over the years] and its general cryptographic sophistication, none of it makes sense if there isn’t a backdoor in this.”

Schneier agrees and says the NSA has done too many other things for him to think, when he sees government-mandated crypto that’s weak, that it’s just by accident.

“If we were living in a kinder world, that would be a plausible explanation,” he says. “But we’re living in a very malicious world, it turns out.”

He adds that the uncertainty around the algorithm and standard is the worst part of the whole matter.

“This is the worst problem that the NSA has done,” Schneier says. “They have so undermined the fundamental trust in the internet, that we don’t know what to trust. We have to suspect everything. We’re never sure. That’s the greatest damage.”

Everything You Need to Know About Microsoft’s New Surfaces

The Surface 2 and Surface Pro 2. Photo: Andrew White/WIRED

NEW YORK CITY — Less than a year after its first serious foray into building its own hardware, Microsoft showed off the Surface 2, the Surface Pro 2, and several new accompanying peripherals at a New York City event.

Here’s everything you need to know about the new products.

Surface Pro 2

Photo: Andrew White/WIRED

None of the Surface Pro 2 updates are all that surprising. Indeed, the changes are fairly subtle, as Surface Product Manager Panos Panay repeatedly emphasized during the event. The most noticeable external difference is a new kickstand, integrated into both the Surface Pro 2 and the Surface 2, that provides two positions instead of one. The new position supposedly makes it easier to use the Surface in your lap.

Beyond the new kickstand, most internal updates are incremental ones. The most important update to the Surface Pro 2 is its new Intel Haswell Core i5 chip. It extends battery life by 75 percent, according to Microsoft, putting it at around seven to eight hours, whereas the Surface Pro only got around four to five.

“Surface Pro 2 now lets you use it all day,” Panos Panay said. “When I say it’s the most productive and powerful professional tablet today, I mean it.”

Other minor changes include a speaker boost with built-in Dolby. The screen now has 50 percent more color accuracy, according to Panay, who also emphasized that Surface Pro 2 is faster than 95 percent of laptops available today. Users will also be able to choose between a couple of RAM configurations — 4GB and 8GB — along with the four available storage sizes: 64GB, 128GB, 256GB and 512GB. Everything else about the Surface Pro 2 is the same as the Surface Pro.

The Surface Pro 2 will start at $899. It will be available on October 22, but you can start pre-ordering September 24.

Surface 2

Photo: Andrew White/WIRED

The Surface 2 — Microsoft removed the ‘RT’ from the name — received more drastic changes. In addition to a new white casing, it comes equipped with the latest ARM-based Tegra 4 processor, making it a good deal faster than its predecessor. It also gets 25 percent more battery life — more than 12 hours a day, according to Panay — and comes with the fancier ClearType Full HD screen, which had previously been available only on the Surface Pro.

But as much as the specs have changed, the software is still at the core of the experience. “I can tell you all day: specs, specs, specs,” Panay said. “But how’s it better? What’s changed? What’s the difference?”

Because the Surface 2, like the Surface RT before it, runs Windows RT, it depends on Windows Store apps alone. The app store has been struggling since its inception, but it is finally getting some traction among developers. Microsoft announced that there are 100,000 apps in the Windows Store right now. The Surface 2 will also come with the full Office suite, including Outlook, which was missing from the Surface RT’s Office suite.

Photo: Andrew White/WIRED

The Surface 2 camera also got an update. A new sensor allows it to shoot better video even in very dark lighting. It’s optimized for Skype video calls too (Surface 2 users get free international calling with Skype).

Panay added that when you buy one, you’ll receive 200GB of SkyDrive storage for two years.

The Surface 2 starts at $449. It will be available on October 22, and pre-orders begin September 24. The Surface RT will remain on sale for $349.

Accessories

Photo: Andrew White/WIRED

More exciting than the refreshed computer-tablet hybrids are a handful of new Microsoft-built Surface peripherals. The company clearly wants to build a hardware ecosystem around its devices. And after teasing it last year, Microsoft announced a new Power Cover, a thick keyboard cover with a built-in battery pack to give the Surface Pro 2 an extra two-hour boost.

The company also showed off a new Surface docking station, compatible with both of the new devices, that allows you to add an external monitor via the Mini DisplayPort. The dock comes with Ethernet, audio in and out, one USB 3.0 port, and three USB 2.0 ports.

The Type and Touch Covers also got an update. They now come in a variety of new colors, are one millimeter thinner than the last versions, and have silent keys and faster keyswitches for quick, responsive typing. Most importantly, the Type Cover 2 is now backlit. It dims on its own when you’re not using it, to save battery life. It won’t, however, be available until next year.

The Touch Cover 2 comes with the same backlighting and increased typing sensitivity and accuracy tweaks, according to Microsoft. “You will type on this thing and you will not miss words,” Panay said. You can also do gestures on the cover.

Photo: Andrew White/WIRED

There are also many new attachment possibilities. Microsoft showed off the Blade from the new Surface Remix Project. It’s a special add-on that features 12 large buttons, letting you DJ and remix songs. The sensor system can recognize pressure, so the harder you tap on a key, the louder the sound associated with it. It’s a totally new kind of keyboard attachment, and you can begin to imagine the niche products that can come from the technology. Think accessories specifically for painting or video game playing.

More than anything else, today’s event shows that Microsoft is committed to the hardware side of its business — its hybrid tablet-laptop Surface, in particular — despite little commercial success. Though reviewers had generally good things to say about the Surface RT and Surface Pro last year, it didn’t translate to significant sales for Microsoft. Sales through the end of June totaled only $853 million, according to a Form 10-K SEC filing from the company. That amounts to around 1.5 million Surface RTs, and only 400,000 Surface Pros, according to an earlier Bloomberg report.

Panos Panay with the Surface Pro 2. Photo: Andrew White/WIRED

Microsoft has faced several hurdles in marketing and selling its Surfaces. The main one is related to price. The Surface Pro cost upwards of $900, and nothing has really changed with the Surface Pro 2 (indeed, things have actually gotten a bit more expensive).

The Surface RT, on the other hand, was priced to compete against Apple’s iPad at $500 for the 32GB model without a Touch Cover. Even that didn’t work out. Microsoft ended up slashing the price this summer to $349, and will keep that device available for sale. The Surface 2 maintained a fairly low price, at only $100 more.

Keeping the price low on the Surface 2 may help make it slightly more appealing to the masses, but the RT version of the device has a bit of an identity crisis. Though Microsoft positions Windows RT as a lighter, more consumer-facing version of Windows, many people don’t understand the appeal of the OS. With the Surface 2 come several compelling new uses that Microsoft hopes will help win over more tablet users. Despite some initial confusion, the company is sticking to its hardware — and is already working on Surface models three generations ahead.

BlackBerry Nears the End as Jobs Disappear and Losses Mount

BlackBerry CEO Thorsten Heins, shown here during slightly happier times, had promised the company would overhaul its corporate culture. But so far, the effort hasn’t been enough. Photo: Official BlackBerry Images/Flickr

The ride is winding down. After a year during which investors first gave BlackBerry another chance, then threw up their hands, shares have plunged again, this time on the news that the company expects to report nearly $1 billion in losses during the second quarter.

The once-reigning master of mobile messaging also plans to lay off 4,500 employees, or more than one-third of its workforce.

BlackBerry shares finished the day down more than 17 percent on the NASDAQ, heading back toward the lows seen nearly a year ago. Optimists were hopeful that the long-delayed BlackBerry 10 OS might at least help the company regain a hold on the businesses of the world, which have traditionally gravitated toward the software tools that let them secure and manage phones used across a large organization.

Instead, the company ran headlong into the bring-your-own-device wave. Corporate America realized that employees were using their own devices (read: iPhones and Android devices) for work anyway. Rather than fight their own workers every step of the way, businesses decided to figure out how to incorporate those devices into their own workflow. Large organizations regained some centralized control they had lost, while workers were happier and more productive.

Those productivity gains might have had something to do with the fact that their iPhones and Android devices were more effective than the handsets the dysfunctional BlackBerry was making.

The launch of the new iPhones today must make BlackBerry’s bad news all the more stinging for company employees, executives, and any shareholders still hanging on. But the story of the company’s decline is nearly as old as the iPhone itself. The best the company can likely hope for is a Nokia-style takeover, though who would actually take over is tough to imagine.

The other possibility is that a leaner BlackBerry will abandon the consumer market once and for all and get rigorous about becoming a niche company focused on a particular customer — one that needs security, control, and a few particular tasks done remarkably well, possibly even as a software rather than a hardware company. Now that the iPhone has gone gold, maybe BlackBerry could remake itself as stainless steel — not as valuable a metal, but strong, polished, and incorruptible.

BlackBerry's Steep Decline

BlackBerry’s fortunes in the U.S. took a steep drop starting around the time BlackBerry addict Barack Obama took office. Graphic: Marcus Wohlsen/Data: ComScore

RSA Tells Its Developer Customers: Stop Using NSA-Linked Algorithm

Amidst all of the confusion and concern over an encryption algorithm that may contain an NSA backdoor, RSA Security released an advisory to developer customers today noting that the algorithm is the default in one of its toolkits and strongly advising them to stop using the algorithm.

The advisory provides developers with information about how to change the default to one of a number of other random number generator algorithms RSA supports and notes that RSA has also changed the default on its end in BSafe and in an RSA key management system.

The company is the first to go public with such an announcement in the wake of revelations by the New York Times that the NSA may have inserted an intentional weakness in the algorithm — known as Dual Elliptic Curve Deterministic Random Bit Generation (or Dual EC DRBG) — and then used its influence to get the algorithm added to a national standard issued by the National Institute of Standards and Technology.

In its advisory, RSA said that all versions of the RSA BSAFE Toolkits, including all versions of Crypto-C ME, Micro Edition Suite, Crypto-J, Cert-J, SSL-J, Crypto-C, Cert-C, and SSL-C, were affected.

In addition, all versions of RSA Data Protection Manager (DPM) server and clients were affected as well.

The company said that to “ensure a high level of assurance in their application, RSA strongly recommends that customers discontinue use of Dual EC DRBG and move to a different PRNG.”

RSA is currently doing an internal review of all of its products to see where the algorithm gets invoked and to change those. A company spokesman said the review is expected to be completed next week.

“Every product that we as RSA make, if it has a crypto function, we may or may not ourselves have decided to use this algorithm,” said Sam Curry, chief technical officer for RSA Security. “So we’re also going to go through and make sure that we ourselves follow our own advice and aren’t using this algorithm.”

Curry told WIRED that the company added the algorithm to its libraries in 2004 and 2005 at a time when elliptic curve algorithms were becoming the rage and were considered to have advantages over other algorithms. The algorithm was approved by NIST in 2006 for a standard governing random number generators.

BSafe has six random number generators in it: some are hash-based, and several are elliptic-curve based, like the algorithm in question. Curry says they chose Dual EC DRBG as the default “on the basis of providing the best security for our customers.”

The algorithm, he said, had features that gave it advantages over the others.

“The ability to do continuous testing of output, for instance, or the ability to do general sort of prediction resistance and to be able to do re-seeding,” he said. “Those are really attractive features.”

The advisory to RSA developers reads as follows:

Due to the debate around the Dual EC DRBG standard highlighted recently by the National Institute of Standards and Technology (NIST), NIST re-opened for public comment its SP 800-90 standard which covers Pseudo-random Number Generators (PRNG).

For more information about the announcement see:

http://csrc.nist.gov/publications/PubsDrafts.html#SP-800-90-A%20Rev%201%20B%20and%20C

The ITL Security Bulletin mentioned in this announcement includes the following:

“Recommending against the use of SP 800-90A Dual Elliptic Curve Deterministic Random Bit Generation: NIST strongly recommends that, pending the resolution of the security concerns and the re-issuance of SP 800-90A, the Dual_EC_DRBG, as specified in the January 2012 version of SP 800-90A, no longer be used.”

The currently released and supported versions of the BSAFE libraries (including Crypto-J 6.1.x and Crypto-C ME 4.0.x) and of the RSA DPM clients and servers use Dual EC DRBG as the default PRNG, but most libraries do support other PRNGs that customers can use.  We are providing guidance to our customers on how to change the PRNG from the default in their existing implementation.

In the current product documentation, RSA has provided technical guidance for RSA BSAFE Toolkits and RSA DPM customers to change the PRNG in their implementation.

RSA will change the default RNG in RSA BSAFE Toolkits and RSA DPM as appropriate and may update the algorithm library as needed.