Channel: Processors – Michael Tsai

The End of x86? An Update

Michael J. Fern (via Hacker News):

x86 shipments dropped by 9% in Q3 2012. Furthermore, the expected surge in PC sales (and x86 shipments) in Q4 due to the release of Windows 8 has failed to materialize. NPD data indicates that Windows PC sales in U.S. retail stores fell a staggering 21% in the four-week period from October 21 to November 17, compared to the same period the previous year. In short, there is now falling demand for x86 processors. Computer buyers are shifting their spending from PCs to next generation computing devices, including smartphones and tablets.

It’s interesting to consider that Intel’s problem may be more with its business model than its technology. I think they’ll be able to make processors that are competitive in terms of performance and power, but what about in terms of dollars?

Meanwhile, Matt Mastracci has an interesting idea about how Apple might use ARM chips in Macs:

There is a fair bit of space on the inside of a MacBook compared to an iPad or iPhone. Apple would use some of this space to drop one of the A5 chips on the motherboard next to the Intel chip, effectively [building] themselves a hybrid ARM/x64 system.

In other words, ARM can’t replace x86 on the desktop because it’s far too slow for certain tasks, particularly at running existing x86 software in emulation. But what if you could have an x86 that’s powered down most of the time? You could have a lot of battery life without sacrificing performance when the Mac is plugged in.


Apple’s Cyclone Microarchitecture Detailed

Anand Lal Shimpi:

With six decoders and nine ports to execution units, Cyclone is big. As I mentioned before, it’s bigger than anything else that goes in a phone. Apple didn’t build a Krait/Silvermont competitor, it built something much closer to Intel’s big cores. At the launch of the iPhone 5s, Apple referred to the A7 as being “desktop class” - it turns out that wasn’t an exaggeration.

Cyclone is a bold move by Apple, but not one that is without its challenges. I still find that there are almost no applications on iOS that really take advantage of the CPU power underneath the hood. More than anything Apple needs first party software that really demonstrates what’s possible. The challenge is that at full tilt a pair of Cyclone cores can consume quite a bit of power. So for now, Cyclone’s performance is really used to exploit race to sleep and get the device into a low power state as quickly as possible. The other problem I see is that although Cyclone is incredibly forward looking, it launched in devices with only 1GB of RAM. It’s very likely that you’ll run into memory limits before you hit CPU performance limits if you plan on keeping your device for a long time.

Skylake

Peter Bright:

As has been the case for many years now, reducing power consumption remains Intel’s top priority for Skylake. Not only does reduced power consumption enable the company’s processors to be used more widely—client Skylake processors will span everything from 4.5W tablet and ultralight systems up to 95W desktop devices, a 20-fold difference in power envelope—it also enables greater performance. Reduce the power used by one part of the chip and the extra thermal headroom (and current draw) can be spent on other parts of the chip; this is the underlying principle of Turbo Boost.
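
The budget-shuffling principle Bright describes can be sketched in a few lines: a fixed package power budget is divided among chip domains, and headroom freed by an idle domain can be spent boosting another. Everything here is illustrative; the budget figure and per-domain draws are invented, not real Skylake numbers.

```python
# Toy sketch of the power-budgeting idea behind Turbo Boost.
# All figures are invented for illustration.

PACKAGE_BUDGET_W = 15.0  # hypothetical TDP for a mobile part

def boost_budget(active_draw_w: dict[str, float], domain: str) -> float:
    """Watts of extra headroom `domain` may spend, given what all
    domains (itself included) are currently drawing."""
    others = sum(w for d, w in active_draw_w.items() if d != domain)
    return max(0.0, PACKAGE_BUDGET_W - others - active_draw_w[domain])

# GPU nearly idle: the CPU cores can turbo into the freed headroom.
draw = {"cores": 6.0, "gpu": 1.0, "uncore": 2.0}
print(boost_budget(draw, "cores"))  # 6.0 W of headroom
```

When the GPU ramps up, the same arithmetic shrinks the cores' headroom to zero, which is why sustained CPU+GPU workloads clock lower than either alone.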

[…]

With this new design, the eDRAM is always coherent, since it is privy to all writes made to main memory, regardless of which core makes them. This also means that it can cache any data, even if it’s stored in memory that is marked as “uncacheable” by the operating system. The design also enables both PCIe devices and the display engine to read from and write to the cache.

[…]

Skylake has some “more of the same” aspects to its power conservation—more individual parts of the processor can have their frequency adjusted or powered down to allow finer tuning of power consumption—though these have been extended. For example, most code either never uses the AVX2 instruction set, or uses it extensively; it’s rare for applications to only use AVX2 every now and then. When faced with workloads that never use AVX2, those instruction units are powered down.

[…]

In Skylake, the power management is more cooperative. The operating system still has some control—for example, it can force a low frequency for extending battery life, or more commonly, it can set a range of acceptable frequencies—but the processor itself handles the rest. Rather than just choosing between P0 turbo states, the processor can pick between the full range of P states, from the minimum frequency all the way up to P0. […] This means that the processor is both quicker to react to new work, boosting the frequency as needed, but also much quicker to cut the frequency when idle.
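
The division of labor Bright describes, where the OS supplies an acceptable frequency window and the processor picks any P-state inside it, can be sketched like this. The P-state frequencies and the demand heuristic are invented for illustration, not Intel's actual policy.

```python
# Minimal sketch of Skylake-style hardware P-state selection:
# the OS sets a [min, max] frequency window, the hardware picks
# within it based on observed utilization. Numbers are invented.

P_STATES_MHZ = [400, 800, 1200, 1600, 2000, 2400, 2800]  # Pn .. P0

def pick_p_state(utilization: float, os_min: int, os_max: int) -> int:
    """Choose the lowest P-state frequency covering current demand,
    clamped to the OS-supplied window."""
    demand = utilization * P_STATES_MHZ[-1]
    for freq in P_STATES_MHZ:
        if freq >= demand:
            return min(max(freq, os_min), os_max)
    return os_max

print(pick_p_state(0.10, 400, 2800))  # light load -> 400
print(pick_p_state(0.95, 400, 2800))  # bursty work -> 2800
print(pick_p_state(0.95, 400, 1600))  # OS battery cap -> 1600
```

The third call shows the "range" case from the quote: even under heavy demand, the hardware respects the ceiling the OS imposed for battery life.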

Joe Rossignol:

Last month, a leaked Intel slide deck revealed that “Y” series Skylake processors appropriate for the 12-inch Retina MacBook will have up to 17% faster CPU performance, up to 41% faster Intel HD graphics and up to 1.4 hours longer battery life compared to current-generation Core M architecture.

Update (2015-09-02): Ian Cutress (comments):

All of the Core M processors are launching today, as are the i3/i5/i7 models and two new Xeon mobile processors. From a power perspective this means Intel is releasing everything from the 4.5W ultra-mobile Core M through the large 65W desktop models, along with the previously released 91W desktop SKUs.

nhaehnle:

The most interesting to me is that Intel apparently stopped publishing transistor counts starting with the 14nm node.

This is significant because as structure sizes become smaller, the restrictions on possible layouts (so-called DRCs, design rule constraints) become ever stricter. For example, you can't just place wires wherever you want; you have to take into account the relationship to other wires. With stricter rules, the end result may be that the effective scaling achieved is worse than what the structure size suggests, because the rules force a lower density of transistors.

Joe Rossignol:

Intel announced a number of new 45-watt “H-Series” processors, but none with the higher-end Iris Pro graphics Apple uses in the 15" Retina MacBook Pro. Skylake H-Series chips with Iris Pro graphics are not expected to launch until early 2016, and Intel has yet to release detailed specs on these chips.

Apple’s Processor Advantage

Steve Cheney (comments):

One of Steve Jobs’ biggest legacies was his decision to stop relying on 3rd party semiconductor companies and create an internal silicon design team. I would go so far as to argue it’s one of the three most important strategic decisions he ever made.

[…]

It is – in fact – these chip making capabilities, which Jobs brought in-house shortly after the launch of the original iPhone, that have helped Apple create a massive moat between itself and an entire industry.

[…]

The truth is the best people in chip design no longer want to work at Intel or Qualcomm. They want to work at Apple. I have plenty of friends in the Valley who affirm this. Sure Apple products are cooler. But Apple has also surpassed Intel in performance. This is insane. A device company – which makes CPUs for internal use – surpassing Intel, the world’s largest chip maker which practically invented the CPU and has thousands of customers.

John Gruber:

I don’t think it has gotten through the heads of many people that Apple has now turned the old dynamic on its head. Apple’s ARM chips are years ahead of the commodity chips used by its competition, and are set to surpass even Intel’s x86 chips in terms of performance-per-watt.

[…]

We should clarify one point from Cheney’s headline — Apple’s lead is formidable, not insurmountable. Nothing in tech is insurmountable.

Dave Lee (comments):

The University of Wisconsin successfully claimed that Apple used its microchip technology without permission in some iPhones and iPads.

The patent, filed in 1998, is said to improve the power efficiency of microchips.

devit:

Looks like the “idea” of the patent in the description is to use a predictor to predict when a STORE and LOAD alias and not speculate the LOAD and any instruction depending on the load (although the claims generalize this to any non-static dependency).

As it generally happens in software/hardware patents, the claimed solution seems quite obvious whenever one wants to solve that particular problem, and the hard part is the “execution”, i.e. implementing it efficiently and figuring out whether the tradeoffs are worth it.
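
devit's summary suggests a familiar predictor shape: a small table of saturating counters indexed by the load's address, trained on observed mis-speculations. The toy sketch below is not the patented design, just an illustration of that general structure; the table size and counter policy are invented.

```python
# Toy store-load alias predictor: remember which loads have aliased
# in-flight stores, and stop speculating them once confidence builds.
# Sizes and thresholds are invented for illustration.

TABLE_SIZE = 64

class AliasPredictor:
    def __init__(self):
        self.counters = [0] * TABLE_SIZE  # saturating counters, 0..3

    def _index(self, load_pc: int) -> int:
        return load_pc % TABLE_SIZE

    def should_speculate(self, load_pc: int) -> bool:
        # Speculate the load past older stores only while confidence
        # that it aliases is still weak.
        return self.counters[self._index(load_pc)] < 2

    def train(self, load_pc: int, aliased: bool):
        i = self._index(load_pc)
        if aliased:
            self.counters[i] = min(3, self.counters[i] + 1)
        else:
            self.counters[i] = max(0, self.counters[i] - 1)

p = AliasPredictor()
pc = 0x4C8
p.train(pc, aliased=True)      # first mis-speculation observed
p.train(pc, aliased=True)      # and a second
print(p.should_speculate(pc))  # False: now hold the load back
```

As devit notes, the table itself is the easy part; the hard engineering is wiring this into the scheduler so that held-back loads wake up at the right time.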

Joe Mullin:

In this case, WARF said the ’752 patent improves the A7, A8, and A8X chips Apple uses in newer iPhones and iPads. Now that the jury has found Apple liable, it will decide on damages; in earlier orders, US District Judge William Conley has written that the maximum in damages that can be claimed is $862.4 million. A third phase of the trial will determine whether Apple was a “willful” infringer; if so, damages could be tripled. If both the damage and willfulness phases go poorly for Apple, it could be a record-breaking verdict.

John Gruber:

The more I think about it, the more sure I am that it’s wrong to call WARF a patent “troll”. They are a non-practicing entity, but a university almost has to be. Universities don’t produce commercial products, they conduct research. And WARF uses its patent royalties to fund research.

The Success of ARM

ChuckMcM:

What is fascinating is that Intel got into that position by being open: there were no fewer than 12 licensees for its 8086 design, and people had supplanted “expensive, proprietary lock-in” type architectures with more open and cheaper chips. It was the emergence of the PC market, and the great Chip Recession of 1984, when Intel decided that if it was going to stay a chip maker, it had to be the best source of its dominant computer chips. I was at Intel at the time and it shifted from partnering, to competing, with the same people who had licensed its chips, with the intent of “reclaiming” the market for CPU chips for itself.

[…]

The relentless pace of putting more transistors into less space drove an interesting problem for ARM. When you get a process shrink you can do one of two things: you can cut your costs (more die per wafer), or you can keep your costs about the same and increase features (more transistors per die). And the truth is you always did a bit of both. But the challenge with chips is that their macro-scale parts (the pin pads, for example) really couldn’t shrink. So you became “pad limited”. The ratio of the area dedicated to the pads (which you connected external wires to) and the transistors could not drop below the point where most of your wafer was “pad”. If it did, your costs flipped, and your expensive manufacturing process was producing wafers of mostly pads, not utilizing its capabilities.
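
ChuckMcM's pad-limit point can be made concrete with a little arithmetic: logic area scales roughly with the square of the feature size, while the pad ring does not. The die dimensions below are invented purely to show the trend.

```python
# Toy illustration of becoming "pad limited": logic shrinks with the
# process node, the pad ring does not, so below some node the pads
# dominate the die. All areas are invented.

PAD_RING_MM2 = 4.0  # fixed area for bond pads / IO; can't shrink

def pad_fraction(logic_mm2_at_90nm: float, node_nm: float) -> float:
    """Fraction of the die that is pad ring at a given node, assuming
    ideal (quadratic) area scaling of the logic from a 90 nm baseline."""
    scale = (node_nm / 90.0) ** 2
    logic = logic_mm2_at_90nm * scale
    return PAD_RING_MM2 / (PAD_RING_MM2 + logic)

for node in (90, 65, 45, 28):
    print(node, round(pad_fraction(16.0, node), 2))
```

With these made-up numbers the pad fraction grows from 20% at 90 nm to over 70% at 28 nm, which is exactly the pressure that pushed vendors to fill the "free" logic area with peripherals, producing the SoC explosion described next.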

[…]

So we had an explosion of “system on chip” products with all sorts of peripherals that continues to this day. And the process feature size keeps getting smaller, and the stuff added keeps growing. The ARM core was so small it could accommodate more peripherals on the same die, which made it cost-effective and a good choice for phones, which needed long battery life at low cost.

Update (2015-12-03): mrpippy (via Twitter):

The advantages that Apple derives from the A-series SoCs are not due to any inherent advantage of ARM vs. x86, but because Apple has full control over the design and manufacturing.

Intel CPU Bugs of 2015

Dan Luu (via Hacker News and Peter Steinberger):

We’ve seen at least two serious bugs in Intel CPUs in the last quarter, and it’s almost certain there are more bugs lurking. Back when I worked at a company that produced Intel compatible CPUs, we did a fair amount of testing and characterization of Intel CPUs; as someone fresh out of school who’d previously assumed that CPUs basically worked, I was surprised by how many bugs we were able to find. Even though I never worked on the characterization and competitive analysis side of things, I still personally found multiple Intel CPU bugs just in the normal course of doing my job, poking around to verify things that seemed non-obvious to me. Turns out things that seem non-obvious to me are sometimes also non-obvious to Intel engineers. As more services move to the cloud and the impact of system hang and reset vulnerabilities increases, we’ll see more black hats investing time in finding CPU bugs. We should expect to see a lot more of these when people realize that it’s much easier than it seems to find these bugs. There was a time when a CPU family might only have one bug per year, with serious bugs happening once every few years, or even once a decade, but we seem to have moved past that. In part, that’s because “unpredictable system behavior” has moved from being an annoying class of bugs that forces you to restart your computation to an attack vector that lets anyone with an AWS account attack your cloud-hosted services, but it’s mostly because CPUs are now complex enough that they’ve become too complicated to test effectively. Ironically, hardware virtualization is supposed to help us with security, but it is so complicated that the implementation is likely to expose “unpredictable system behavior” bugs that wouldn’t otherwise have existed.

Cupertino’s Chief Chipmaker, Johny Srouji

Bloomberg Businessweek (comments):

Srouji runs what is probably the most important and least understood division inside the world’s most profitable company. Since 2010, when his team produced the A4 chip for the original iPad, Apple has immersed itself in the costly and complex science of silicon. It develops specialized microprocessors as a way to distinguish its products from the competition. The Apple-designed circuits allow the company to customize products to perfectly match the features of its software, while tightly controlling the critical trade-off between speed and battery consumption. Among the components on its chip (technically called a “system on a chip,” or SOC) are an image signal processor and a storage controller, which let Apple tailor useful functions for taking and storing photos, such as the rapid-fire “burst mode” introduced with the iPhone 5s. Engineers and designers can work on features like that years in advance without prematurely notifying vendors—especially Samsung, which manufactures many of Apple’s chips.

[…]

A former Apple engineer who worked on the [original iPhone] said that while the handset was a breakthrough technology, it was limited because it pieced together components from different vendors, including elements from a Samsung chip used in DVD players. “Steve came to the conclusion that the only way for Apple to really differentiate and deliver something truly unique and truly great, you have to own your own silicon,” Srouji says. “You have to control and own it.”

What Is the Secure Enclave?

Mike Ash (comments):

Each iOS CPU is built with a 256-bit unique identifier or UID. This UID is burned into the hardware and not stored anywhere else. The UID is not only inaccessible to the outside world, but it’s inaccessible even to the software running at the highest privilege levels on the CPU. Instead, the CPU contains a hardware AES encryption engine, and the only way the UID can be accessed by the hardware is by loading it into the AES engine as a key and using it to encrypt or decrypt data.

Apple uses this hardware to entangle the user’s passcode with the device. By setting the device’s UID as the AES key and then encrypting the passcode, the result is a random-looking bunch of data which can’t be recreated anywhere else, since it depends on both the user’s passcode and the secret, unreadable, device-specific UID.
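
Ash's entangling step can be sketched in a few lines. The real hardware feeds the UID into a dedicated AES engine; since Python's standard library has no AES, HMAC-SHA256 stands in here as the keyed one-way function, and the UID value is of course made up.

```python
# Conceptual sketch of entangling a passcode with the device UID.
# HMAC-SHA256 is a stand-in for the hardware AES engine; the UID
# below is invented (the real one never leaves the silicon).

import hashlib
import hmac

DEVICE_UID = bytes.fromhex("aa" * 32)  # hypothetical 256-bit fused UID

def entangle(passcode: str) -> bytes:
    """Derive a device-bound key: random-looking data that can't be
    recreated without both the passcode and this exact chip's UID."""
    return hmac.new(DEVICE_UID, passcode.encode(), hashlib.sha256).digest()

k1 = entangle("1234")
k2 = entangle("1235")
print(k1 != k2)  # True: each passcode yields a distinct key
print(len(k1))   # 32 bytes
```

The practical consequence is the one Ash draws: a brute-force attempt has to run on the device itself, because only the device's hardware holds the second input to the derivation.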

[…]

The Secure Enclave contains its own UID and hardware AES engine. The passcode verification process takes place here, separated from the rest of the system.

[…]

The escalating delays for failed passcode attempts are enforced by the Secure Enclave. The main CPU merely submits passcodes and receives the results. The Secure Enclave performs the checks, and if there have been too many failures it will delay performing those checks. The main CPU can’t speed things along.
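
The enforcement model is simple to sketch: the checker sleeps before it checks, so the caller cannot skip the wait. The delay schedule below is invented for illustration and is not Apple's actual schedule.

```python
# Sketch of Secure Enclave-style escalating delays: the delay is
# enforced inside the checker, so the main CPU can only submit a
# guess and wait. The schedule is invented.

import time

DELAYS_S = {5: 60, 6: 300, 7: 900, 8: 3600}  # after N failures

class PasscodeChecker:
    def __init__(self, correct: str):
        self._correct = correct
        self._failures = 0

    def attempt(self, guess: str, sleep=time.sleep) -> bool:
        sleep(DELAYS_S.get(self._failures, 0))  # enforced before checking
        if guess == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        return False

c = PasscodeChecker("1234")
waits = []
for _ in range(6):
    c.attempt("0000", sleep=waits.append)
print(waits)  # [0, 0, 0, 0, 0, 60] -- the sixth attempt had to wait
```

The `sleep` parameter is stubbed out in the demo only so it runs instantly; the point is that the wait lives on the checking side of the interface.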

[…]

This would be fairly easy to implement, and shouldn’t affect the usability of the device. Given Apple’s public stance on user privacy, I would find it extremely weird if the Secure Enclave’s software update mechanism wasn’t implemented in this way. On the other hand, Tim Cook’s public letter implies that all iPhone models are potentially vulnerable, so perhaps they haven’t taken this extra step.

Previously: FBI Asks Apple for Secure Golden Key.


Andy Grove, RIP

Casey Newton:

Andy Grove, who fled from Nazi and Soviet oppression to become one of the most powerful business leaders in the global tech industry as the chairman and CEO of Intel, died on Monday. He was 79. The cause of death was not reported, though Grove was a longtime sufferer of Parkinson’s disease.

Intel:

Present at Intel’s 1968 founding with Robert Noyce and Gordon Moore, Andy Grove became Intel’s President in 1979 and CEO in 1987. He served as Chairman of the Board from 1997 to 2005. Both during his time at Intel and in retirement, Grove was one of the most influential figures in technology and business, writing best-selling books and widely cited articles, and speaking out on an array of prominent public issues.

Steve Johnson:

During his three decades with the Santa Clara corporation, the gruff and demanding Grove helped mold Intel into a multibillion-dollar Goliath and the world’s biggest semiconductor company. Along the way, he also became a prolific author, donated millions of dollars to charity and was lavished with awards, including being named Time magazine’s Man of the Year.

[…]

Grove fled to Austria at the age of 20 and, with $20 in his pocket, emigrated to the United States, where he changed his name from Grof to Grove, moved in with relatives and was accepted at City College of New York.

[…]

Finishing City College in 1960 at the top of his class with a bachelor’s degree in chemical engineering, he entered graduate school at UC Berkeley and arranged for his parents to leave Hungary and join him in California. After receiving a doctorate degree in chemical engineering in 1963, Grove landed a job with Silicon Valley chip pioneer Fairchild Semiconductor, where he became assistant director of research and development in 1967.

Ian King:

When Steve Jobs and Larry Ellison told Andy Grove he was the only person in Silicon Valley who they would willingly work for, he told them he wouldn’t have hired either because they were “a couple of flakes.”

He was at least half serious and didn’t crack a smile.

[…]

If Grove experienced fear when he came to Intel, that didn’t stop him from using it as a management technique. He influenced a generation of Intel executives who referred to planning meetings with him as a “Hungarian inquisition.”

“Mentoring with Andy Grove was like going to the dentist and not getting Novocain,” said Pat Gelsinger, a former Intel executive who went on to become CEO of VMware Inc.

Jonathan Kandell:

The first major crisis was linked to the rise of cheaper, high-quality Japanese memory chips beginning in the late 1970s. Instead of cutting costs by laying off staff, Mr. Grove demanded that Intel employees work an extra two hours a day — for free. Almost simultaneously, Intel introduced an advanced chip, the iAPX 432 microprocessor, that the company claimed would reshape computing’s future.

Instead, it proved a disaster, running 5 to 10 times more slowly than competitors. Part of the problem, Mr. Grove conceded in a 2001 interview with Wired magazine, was that he initially failed to take microprocessors seriously enough. “I was running an assembly line designed to build memory chips,” he said. “I saw the microprocessor as a bloody nuisance.”

But with Mr. Grove at the helm, Intel soon made the transition from memory chip to microprocessor giant.

Ben Thompson:

That’s why the Grove decision that actually impresses me the most is Intel’s launch of the Celeron processor in 1998. Grove had been introduced to a then-relatively-unknown Harvard Business School professor named Clayton Christensen, who told him about research for an upcoming book (The Innovator’s Dilemma) that explained how companies in their pursuit of margin allowed themselves to be beaten on the low-end. Grove took the lesson to heart and directed Intel to create a low-end processor (Celeron) that certainly cannibalized Intel’s top-of-the-line processor to an extent but also dominated the low-end, quickly gaining 35% market share.

Update (2016-04-02): Ken Segall:

Intel’s huge leap in marketing came with the “Intel Inside” campaign. Though it’s grown incredibly tired today, this campaign does hold a place of honor in technology marketing history. It was by advertising the processor inside the PC as a consumer product that Intel became the global powerhouse it is today. It was a huge, bold leap.

Intel’s then-marketing chief, Dennis Carter, has always received credit for the birth of this campaign. But Fortune has a very nice article about Andy Grove (recommended reading), and they report that it was Andy who put his weight behind the campaign when others objected. That’s certainly a feather in his marketing cap.

Intel Splits on Atom

Daniel Eran Dilger:

Intel initially intended for Atom to scale down its legacy Wintel desktop x86 processor architecture for use in efficient mobile devices such as phones and tablets, but that strategy has been effectively abandoned as the chipmaker now moves to refocus its sights on modems, data center, Internet of Things and memory chips.

[…]

The move kills Intel’s once enthusiastic plans to muscle its way back into smartphone devices after first fumbling the ball in 2006, when its former chief executive Paul Otellini dismissed the prospect of supplying chips for Apple’s original iPhone as not worth doing.

See also: Hacker News, Slashdot.

Update (2016-05-04): See also: The Talk Show.

A2: Analog Malicious Hardware

Kaiyuan Yang et al. (PDF) (via Brendan O’Connor):

While the move to smaller transistors has been a boon for performance it has dramatically increased the cost to fabricate chips using those smaller transistors. This forces the vast majority of chip design companies to trust a third party—often overseas—to fabricate their design. To guard against shipping chips with errors (intentional or otherwise) chip design companies rely on post-fabrication testing. Unfortunately, this type of testing leaves the door open to malicious modifications since attackers can craft attack triggers requiring a sequence of unlikely events, which will never be encountered by even the most diligent tester.

In this paper, we show how a fabrication-time attacker can leverage analog circuits to create a hardware attack that is small (i.e., requires as little as one gate) and stealthy (i.e., requires an unlikely trigger sequence before affecting a chip’s functionality).

Andy Greenberg:

In fact, researchers at the University of Michigan haven’t just imagined that computer security nightmare; they’ve built and proved it works. In a study that won the “best paper” award at last week’s IEEE Symposium on Security and Privacy, they detailed the creation of an insidious, microscopic hardware backdoor proof-of-concept. And they showed that by running a series of seemingly innocuous commands on their minutely sabotaged processor, a hacker could reliably trigger a feature of the chip that gives them full access to the operating system. Most disturbingly, they write, that microscopic hardware backdoor wouldn’t be caught by practically any modern method of hardware security analysis, and could be planted by a single employee of a chip factory.

[…]

The “demonically clever” feature of the Michigan researchers’ backdoor isn’t just its size, or that it’s hidden in hardware rather than software. It’s that it violates the security industry’s most basic assumptions about a chip’s digital functions and how they might be sabotaged. Instead of a mere change to the “digital” properties of a chip—a tweak to the chip’s logical computing functions—the researchers describe their backdoor as an “analog” one: a physical hack that takes advantage of how the actual electricity flowing through the chip’s transistors can be hijacked to trigger an unexpected outcome.

mov Is Turing-complete

Stephen Dolan (PDF, via Emily St.):

It is well-known that the x86 instruction set is baroque, overcomplicated, and redundantly redundant. We show just how much fluff it has by demonstrating that it remains Turing-complete when reduced to just one instruction.

The instruction we choose is mov, which can do both loads and stores. We use no unusual addressing modes, self-modifying code, or runtime code generation. Using just this instruction (and a single unconditional branch at the end of the program to make nontermination possible), we demonstrate how an arbitrary Turing machine can be simulated.
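
The core gadget of Dolan's construction is easy to sketch: equality testing with nothing but loads and stores. Modeling memory as a Python dict (`mem[x] = v` for `mov [x], v`, `r = mem[x]` for `mov r, [x]`), the trick looks like this:

```python
# The mov-only comparison gadget from Dolan's paper, simulated with
# a dict standing in for memory. No compare, no branch: two stores
# and one load decide equality.

def equal_mov_only(mem: dict, a: int, b: int) -> int:
    mem[a] = 0     # mov [a], 0
    mem[b] = 1     # mov [b], 1  (overwrites the 0 exactly when a == b)
    return mem[a]  # mov r, [a] -> 1 iff a == b

print(equal_mov_only({}, 0x10, 0x10))  # 1
print(equal_mov_only({}, 0x10, 0x20))  # 0
```

From this one primitive the paper builds table lookups that select between values, which is enough to simulate a Turing machine's transition function.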

movfuscator:

The M/o/Vfuscator (short ‘o’, sounds like “mobfuscator”) compiles programs into “mov” instructions, and only “mov” instructions. Arithmetic, comparisons, jumps, function calls, and everything else a program needs are all performed through mov operations; there is no self-modifying code, no transport-triggered calculation, and no other form of non-mov cheating.

Update (2016-08-29): Rosyna Keller:

As is xor

iPhone 7 Notes

Matthew Panzarino:

Every time you take a picture with the iPhone 7, both the wide angle and telephoto fire off. Yes, two 12 megapixel pictures for every shot. This could be a prime driver behind the increase of the iPhone 7 Plus’ memory to 3GB.

Both images are needed due to a technique Apple is calling “fusion” internally. Fusion takes data from both sensors and merges them into the best possible picture for every condition. If, for instance, there is a low-light scene that has some dark areas, the image-processing chip could choose to pick up some image data (pixels or other stuff like luminance) from the brighter f1.8 wide angle and mix it in with the data from the f2.8 telephoto, creating a composite image on the fly without any input from the user. This fusion technique is available to every shot coming from the camera, which means that the iPhone 7 Plus is mixing and matching data every time that trigger is tapped.

John Gruber:

In my testing I didn’t see any noticeable difference between 1× shots on the iPhone 7 and 7 Plus. I think this “fusion” stuff only kicks in, or at least mostly kicks in, once you start increasing the zoom level. Put another way, I think the wide angle lens assists the telephoto lens more than the telephoto lens assists the wide angle.

Riccardo Mori:

Of all the new camera improvements in the iPhone 7 and 7 Plus — as ingenious as the dual camera system is on the bigger iPhone — my favourite is the flicker sensor. As Schiller explained, the flicker sensor reads the flickering of artificial lighting and can compensate for it in the photos and videos you take. I take a lot of indoor photos, and the flickering can be very annoying, especially when you want to include the source of artificial light in the frame. If this works as advertised, indoor photos and videos taken under artificial light will definitely look better, probably with more natural tones.

John Gruber:

Here’s the genius of the black and (especially) jet black iPhones 7. In a very seductive way, they look like something new and desirable. And at the same time, they are instantly recognizable as iPhones.

John Gruber (Hacker News):

After just five days — more than half of which I’ve spent using the matte black iPhone 7 Plus — this jet black iPhone 7 has a few “micro abrasions”, to use Apple’s own term. I can only see them when I’m looking for them, and only when I reflect light off the surface at the perfect angle, but they’re there. This is after two days of careful use, and never putting it in a pocket that contains anything else. The back surface of this phone shows more wear after (effectively) two days of use than my space gray 6S does after nearly a year.

That said, the unblemished back of the 6S looks downright boring. The jet black back of this iPhone 7 looks glorious.

[…]

The iPhone 7 now has OIS, for both stills and video. It works great. Side-by-side with my old iPhone 6S, I got noticeably better photos at an outdoor family gathering at dusk. I got noticeably better photos shooting indoors at night. And video shot while walking around is noticeably more stable and fluid. OIS does exactly what it says on the tin.

[…]

The new home buttons don’t feel like actual button clicks at all. It feels like the iPhone is clicking, not the button.

[…]

The new Taptic Engine is cool. Here’s my favorite use so far: the spinner control for things like picking a date or time (say, setting an alarm in the Clock app) now feels like a real spinner. It’s uncanny. I can’t wait to see how developers use these APIs.

Jason Snell:

I found that the Jet Black model indeed felt much more grippable than other iPhone 7 or iPhone 6 colors. Imagine placing a slightly damp finger on an iPhone screen, and how much harder it is to swipe your damp finger along that screen. That’s what’s going on with the Jet Black phone: even a little dampness on your fingers will cause them to skid along the surface, while it might slide right over the rougher anodized aluminum surfaces of the other colors.

[…]

I wouldn’t recommend you start using the iPhone for underwater photography—and Apple cautions that water invasion can void your warranty. But if you should get an iPhone 7 a little wet, everything will be okay.

[…]

When I first felt the new home button, I was really disappointed. The vibration felt halfhearted, and it made the act of pushing the home button feel like a letdown. I shouldn’t have worried: Apple actually offers three different levels of vibration in the new Home Button entry in the Settings app. And the most aggressive of those three levels worked great for me. No, the feel’s not the same as the old moving home button, but I managed to get used to it after about three button presses.

David Pogue:

When your phone is locked up, you can no longer hold down the Sleep + Home buttons to force-restart it. Instead, you’re now supposed to use Sleep + Volume Down, just as on many Android phones.

Gus Lubin (via Mike Rundle):

There’s an old rumor that iPhone home buttons break easily, and it’s causing millions of people to use an obscure accessibility feature called AssistiveTouch to avoid pressing them.

Juli Clover:

iPhone 7 and 7 Plus users are going to have a tough time unlocking their devices during wintertime. As it turns out, the new “solid-state” Home button on the iPhone 7 and iPhone 7 Plus requires skin contact or the right kind of capacitive gloves to function. […] And because the iPhone 7 uses the redesigned Lock screen in iOS 10, there's no quick and easy way to bring up the passcode entry screen to unlock the phone manually[…]

Raymond M. Soneira (via Craig Hockenberry):

The display on the iPhone 7 is a Truly Impressive Top Performing Display and a major upgrade and enhancement to the display on the iPhone 6. It is by far the best performing mobile LCD display that we have ever tested, and it breaks many display performance records.

Paul Miller and Dieter Bohn:

As pictured above, you can see a piece of plastic sits behind the ingress protection (waterproofing!), right where the headphone jack would have been. And (update!) according to Apple it’s a “barometric vent.” Apparently adding all the waterproofing to the iPhone 7 and 7 Plus meant that it was more of a sealed box, and so to be able to have an accurate and working barometer, Apple used that space. The barometer is the thing that allows a phone to measure altitude, and Apple points out that on the iPhone 7 it can measure even minor changes like climbing a flight of stairs.

Juli Clover:

IPx7, the water resistance rating, means the iPhone 7 can withstand immersion in water to one meter (3.3 feet) for 30 minutes, tested in laboratory conditions. IPx7 is the second-highest rating, below IPx8, which indicates an ability to withstand long periods of immersion under pressure. Samsung's devices, by the way, are rated at IP68, suggesting better overall water resistance.

Apple has said that removing the headphone jack helped Apple meet IP67, however Samsung’s IP68 Galaxy S7 does have a headphone jack.

Chipworks (MacRumors):

We have revised our first A10 floorplan with help from our friends at AnandTech in the search for the small, high-efficiency cores. Our combined guess is that it is likely they are indeed integrated within the CPU cluster next to the big, high-performance cores. This makes sense given the distinct colour of the small cores indicating a different digital library, and the position of the big core L1.

John Gruber (MacRumors):

Looking at Geekbench’s results browser for Android devices, there are a handful of phones in shouting distance of the iPhone 7 for multi-core performance, but Apple’s A10 Fusion scores double on single-core. […] The iPhone 7 scores better on both single- and multi-core than most MacBook Airs ever made, and performs comparably to a 2013 MacBook Pro.

John Gruber:

The iPhone has all the benefits (in short: superior design) that would keep me, and I think most other iPhone users, on the platform even if it didn’t have a performance advantage. But it does have a significant performance advantage, and it is exclusive to Apple. This is an extraordinary situation, historically. And year-over-year, it looks like Apple’s lead is growing, not shrinking. It’s not a fluke, but a sustained advantage.

Mark Sullivan:

Why? For a long time iPhones were one-size-fits-all-networks phones, but the iPhone 7 and iPhone 7 Plus each come in two different versions (or SKUs, in industry-speak), one with an Intel modem chip inside and one with a Qualcomm modem. The Intel XMM 7360 modem doesn’t work with Sprint’s and Verizon’s 3G CDMA networks, so all Sprint and Verizon customers will get an iPhone 7 with a Qualcomm chip inside. For everyone else, the iPhone 7 could have either an Intel or a Qualcomm modem.

[…]

The end game for Apple may be to work with Intel to co-design a future system on a chip (SoC) that includes the modem, Apple Ax CPU, GPU, and many other components on one chip. This unified design can reduce the space the chips take up inside the phone, reduce the heat they emit, and reduce the power they require. The whole thing may be fabricated at Intel fabrication facilities.

AppleInsider (MacRumors):

In most cases, users claim the EarPods’ volume and call answer/end buttons become unresponsive after a few minutes of inactivity. Audio continues to play, and the microphone remains active, but users are unable to adjust volume settings, start or stop calls, or invoke Siri with the embedded remote.

Mitchel Broussard:

On the MacRumors forums, mentions of a “buzzing” and “static” sound coming from the back of the iPhone 7 and iPhone 7 Plus began on Friday afternoon. […] As pointed out by The Verge, the consensus of the noise’s origin online is that it’s caused by a phenomenon known as “coil noise.”

Oscar Raymundo:

So, you’ve unboxed your brand-new iPhone 7 or 7 Plus. It’s all set up, and it still has that fresh, new iPhone smell. Then, you hear a weird hissing noise. Or realize that the Home button or the Lightning EarPods are acting funky. Or your iPhone 7 is just not connecting to the cellular network. Yikes!

Yes, those are actual glitches that have already been reported by iPhone 7 users. If you’re experiencing a similar technical issue with your new device, take a deep breath and check out some possible solutions below.

David Steele:

Let’s take a look at seven features that Apple are belatedly bringing to the market and then consider why this is a good thing.

See also: more reviews.

Previously: iPhone 7.

Update (2016-09-24): James Thomson:

The lightning headphone adaptor sucks - walking about town listening to podcasts, it went dead 4-5 times and I needed to unplug / replug.

Yes, this was on 10.0.2, and I was actively listening to stuff and the audio just stopped.

Intel Core i7-7700K Kaby Lake Review


Mark Walton (Hacker News):

The Intel Core i7-7700K is what happens when a chip company stops trying. The i7-7700K is the first desktop Intel chip in a brave new post-”tick-tock” world—which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video.

There are apparently power consumption improvements, however.

Dan Luu:

Dear linkbait authors, I’m pretty sure Intel is trying. If you look at research $, they appear to be trying exponentially harder over time.

Marcel Weiher:

We are in an effective post-Moore’s law world, and have been for a couple of years. Yes, we can still put more transistors on the chip, but we are pretty much done with single core performance, at least until some really big breakthrough.

[…]

Most of the things that go into squandering CPU don’t parallelize well, so removing the bloat is actually starting to become cheaper again than trying to combat it with more silicon.

Lloyd Chambers:

[There] are at least a few reasons why the Intel ‘Kaby Lake’ release is significant:

  • Improved graphics performance.
  • My understanding is that the Kaby Lake ‘H’ series supports 32GB memory, thus making a MacBook Pro with 32GB of DDR4 2400 DRAM possible. But whether the power draw is viable on a laptop is unclear (meaning what we could expect from Apple, given the rationalizations seen with the Nov 2016 MacBook Pro).
  • The i7-7920HQ 3.1 GHz (turbo boost to 4.1 GHz, 4 real CPU cores) might be suitable for a MacBook Pro.
  • The i7-7700K 4.2 GHz (turbo boost 4.5 GHz) should be suitable for an iMac. This perhaps is the “standstill” point—that’s only 5% faster than the 4.0 GHz iMac 5K that sits on my desk today—at the cost of a 95 watt TDP.

Paul Haddad:

Every PC manufacturer today announced Kaby Lake updates. I’m guessing Apple will wait until at least April to announce a MacBook with one.

ARM Mac Notebook Rumors


Mark Gurman:

Apple Inc. is designing a new chip for future Mac laptops that would take on more of the functionality currently handled by Intel Corp. processors, according to people familiar with the matter.

[…]

Apple engineers are planning to offload the Mac’s low-power mode, a feature marketed as “Power Nap,” to the next-generation ARM-based chip. This function allows Mac laptops to retrieve e-mails, install software updates, and synchronize calendar appointments with the display shut and not in use. The feature currently uses little battery life while run on the Intel chip, but the move to ARM would conserve even more power, according to one of the people.

This doesn’t make a whole lot of sense to me. It just doesn’t seem like it would be worth it as described.

I’m more intrigued by this Slashdot comment by Anonymous Coward:

Apple already has several ARM powered laptops drifting around internally. I’ve seen several of them with my own eyes. There’s at least five different prototypes, all constructed in plastic cases with varying degrees of complexity (some are literally just a clear acrylic box, others look more like 3D printed or milled parts designed to look like a chunky MBA or iBook).

[…]

All of them boot encrypted and signed OS images, which are fully recoverable over the internet so long as you’ve got WiFi access (similar to how their Intel powered systems do it). You cannot chose a version of the OS to load, you get whatever the latest greatest one is and that’s it. They’ve completely ported OS X to ARM (including all of Cocoa and Aqua), however a ton of utilities that normally come with OS X are missing (there’s no Disk Utility, Terminal, ColorSync, Grapher, X11, Audio/MIDI setup, etc). A lot of that functionality has been merged into a new app called “Settings” (presumably to match the iOS counterpart), which takes the place of System Preferences.

Likewise, App Store distribution appeared to be mandatory. […] The filesystem seemed a bit… peculiar, to say the least. Everything was stored in the root of the disk drive—that is to say, the OS didn’t support multiple users at all, and everything that you’d normally see in your home directory was presented as / instead. I don’t think the physical filesystem was actually laid out like this, it’s just that the Finder and everything else had been modified to make you believe that’s the way the computer worked. There was no /Applications folder anymore, your only option for launching and deleting apps was through Launchpad.

The problem with the “dump Intel for ARM” idea is that it wouldn’t work at the high end. ARM isn’t competitive there, some people really want x86 compatibility, and emulation doesn’t seem feasible. Even Apple wouldn’t alienate its customers with that sort of a switch. But what if the plan is to bifurcate the Mac line? A line of locked down ARM Macs and a line of Pros that really do look Pro in comparison?

The ARM Macs would simply drop support for all the old software. Intel-based Macs would still be around for development and other high-end users who are willing to pay more, but Apple’s focus would be on the At Ease line. It would be a middle ground between iOS and Mac: more powerful than an iPad Pro with a keyboard, and limited to apps from the Mac App Store so that it’s harder to screw up than a regular Mac. This sounds like a crazy rumor, but there is a certain logic to it.

That said, my personal bets are:

  • This is not Apple’s plan.
  • Apple does have a version of macOS running on ARM internally.
  • Switching Macs to ARM will not make sense in the foreseeable future. The advantages of the ARM instruction set diminish as chips grow.
  • Bifurcating the Mac line is not worth the engineering effort or customer confusion just to make some slightly lower power laptops. And if you want a locked down device, Apple already has iOS, which it intends to extend into that middle ground.
  • I consider it much more likely that we’ll see an iOS device with a built-in keyboard.

Update (2017-02-03): ATP Tipster:

Allow me to take a moment and shoot down that Slashdot ARM Mac post. Total bullshit.


How Apple Won Silicon


Rene Ritchie (via John Gruber):

The Apple A10 Fusion system-on-a-chip (SOC) in iPhone 7 mops the floor with both the Samsung Exynos 8895 and Qualcomm Snapdragon 835 found in the Galaxy S8 when it comes to single threaded operations.

[…]

Apple’s platform technologies team doesn’t have to worry about being hobbled or constrained in any way — all they have to do is run iOS and iOS apps faster than anything else on the planet. That’s their only customer.

It makes for an incredibly appealing work environment for legends of the industry and the best and brightest new minds, a startling number of whom have now found a home at Apple.

[…]

Conversely, Apple’s silicon team also doesn’t have to carry the baggage of competing vendors and devices. For example, Apple A10 doesn’t have to support Microsoft’s Direct X. It only and exactly has to support Apple’s specific technologies and implementations.

My iPhone SE, almost two-year-old technology, still feels pretty fast. The slowest parts for me are Touch ID and the cell network, neither of which is limited by the processor. So the flip side of this story is that Apple will need more than faster processors now to entice people to upgrade their phones. Alas, on the Mac it’s the opposite story: I feel like I need more speed but that it’s simply not available.

Intel to Integrate Thunderbolt 3, Eliminate Royalties


Joe Rossignol:

Intel today announced that it plans to drive large-scale mainstream adoption of Thunderbolt by releasing the protocol’s specification to the industry next year under a nonexclusive, royalty-free license.

[…]

Intel also revealed plans to integrate Thunderbolt 3 into its future CPUs, but it didn’t provide a timeline as to when. The all-in-one design will take up less space on a Mac or PC’s logic board, and reduce power consumption by eliminating the need for a standalone Thunderbolt controller.

Hopefully this isn’t too late to avoid a FireWire-like fate.

Update (2017-05-31): Colin Cornaby:

FWIW I think Thunderbolt 3 is seeing a lot more success on the PC side than Firewire ever did, and it’s still growing.

So that’s the weird thing is that the PC companies have been shipping a whole bunch of docks. Meantime the Mac is a mess.

The higher end PC docks have GPUs, which isn’t supported under macOS. Apple blocked certain chipsets. Other stuff seems glitchy on macOS.

The 2017 iMacs


Andrew Cunningham:

At a high level, single- and multi-core CPU performance has increased by around 40 percent since 2012, or by somewhere between 50 and 60 percent if you go back to 2011. Much of that comes from architectural improvements, but the clock-speed boosts deserve some of the credit, too.

[…]

The decision to choose AMD matters because, while its chips aren’t completely uncompetitive and offer a solid value for the price, they generally offer less performance per watt than contemporaneous GPUs from Nvidia. That’s a problem in the iMac especially, since you can’t just add cooling capacity for the sake of boosting performance.

[…]

But if you want to go with a pure SSD or increase your capacity, you’ll pay dearly: upgrading from the standard 2TB Fusion Drive in the top-end iMac to a 512GB SSD costs $200, a 1TB SSD costs $600, and a 2TB SSD costs $1,400.

If you can pay that price, though, you’re getting some of the fastest SSDs that anyone will sell you in any computer. Apple has been ahead of the curve on SSDs since it began moving away from SATA drives to PCI Express drives in 2013, long before anyone else thought to do it. The company has continued to extend its lead by adding more and more PCIe bandwidth and aggressively adopting standards like NVMe.

Apple also charges a lot for RAM, but on the 5K it’s user-replaceable (unlike on the iMac Pro), so you can add your own. Apple asks $600 to upgrade the high-end iMac 5K from 8 GB to 32 GB, but Crucial has a 32 GB kit for $260, and there are likely better deals to be found. Since there are four slots, you can keep the 8 GB and end up with 40 GB total.

Third-party external SSDs are also much cheaper if you want to add storage later, though the performance is likely worse than on the internal SSD.

Matthias Gansrigler:

Just did a quick test duplicating a 3.58 GB zip file in Finder.

#iMac: about 2 seconds

#rMBP2012: about 18 seconds

Nick Heer:

This situation feels like a repeat of the longstanding 16 GB entry-level capacity for iOS devices: it’s clearly inadequate. I don’t know what hardware Apple’s executive team uses, but I doubt any of them could honestly recommend that someone should buy an iMac today with a spinning hard drive. Solid state storage might be far too expensive to put in every iMac, but they could at least start with a Fusion Drive which, yes, would eat into margins, but it would be the right thing to do.

Bug in Skylake and Kaby Lake Hyper-threading


Henrique de Moraes Holschuh:

This advisory is about a processor/microcode defect recently identified on Intel Skylake and Intel Kaby Lake processors with hyper-threading enabled. This defect can, when triggered, cause unpredictable system behavior: it could cause spurious errors, such as application and system misbehavior, data corruption, and data loss.

It was brought to the attention of the Debian project that this defect is known to directly affect some Debian stable users (refer to the end of this advisory for details), thus this advisory.

Please note that the defect can potentially affect any operating system (it is not restricted to Debian, and it is not restricted to Linux-based systems). It can be either avoided (by disabling hyper-threading), or fixed (by updating the processor microcode).

Due to the difficult detection of potentially affected software, and the unpredictable nature of the defect, all users of the affected Intel processors are strongly urged to take action as recommended by this advisory.

Via Tom Harrington:

Check your Mac CPU with “sysctl machdep.cpu” and compare to this. [Skylake list, Kaby Lake list]

Developers who are concerned can use Instruments to disable hyperthreading until reboot. See Instruments prefs.

Unfortunately, the “Hardware Multi-Threading” setting in Instruments does not persist after the Mac reboots or sleeps, so you have to keep re-applying it. The good news is that Apple should be able to offer a software update that applies Intel’s microcode patch.
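Harrington’s `sysctl machdep.cpu` check boils down to comparing the CPU family and model numbers against Intel’s affected lists. Here is a minimal sketch of that comparison as a shell function; the model numbers (Skylake 78/94, Kaby Lake 142/158) are the commonly cited values for these families, and `is_affected` is a hypothetical helper name, not an Apple or Debian tool:

```shell
# Sketch: is this family/model pair on the Skylake/Kaby Lake
# hyper-threading advisory list? (Models per Intel's public docs:
# Skylake mobile/desktop are 78/94, Kaby Lake are 142/158.)
is_affected() {
  family="$1"; model="$2"
  if [ "$family" -eq 6 ]; then
    case "$model" in
      78|94|142|158) echo "affected" ;;
      *) echo "not affected" ;;
    esac
  else
    echo "not affected"
  fi
}

# On an Intel Mac you would feed in live values, e.g.:
#   is_affected "$(sysctl -n machdep.cpu.family)" \
#               "$(sysctl -n machdep.cpu.model)"
is_affected 6 94    # Skylake desktop
is_affected 6 61    # Broadwell
```

Note this only tells you whether the processor family is covered by the advisory, not whether a given microcode revision already contains the fix; the advisory’s microcode-version tables are authoritative for that.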

The MacBook Adorable


Casey Liss:

To me, the real bummer is the lack of USB-C power passthrough on most USB-C devices available for sale today. As an example, when I attempted to do my initial Time Machine backup, I did so via the Ethernet dongle. However, I had to ensure the machine didn’t sleep, since it was on battery power. Furthermore, I had to stress out about whether or not it would complete the initial backup before the battery gave up, since I had no way to power the MacBook and have it connected via Ethernet.

[…]

I opted to get a maxed-out MacBook Adorable. It has the don’t-call-it-a-m7 i7 processor, 16 GB of RAM, and a half-terabyte SSD. For such a small computer, it was far from cheap, at around $2000.

Intel’s naming seems to be almost intentionally confusing. In this case, i7 means that it’s the high-end version of the low-power Core M processor. The i7 line name goes all the way from 3.5 W in the MacBook to 91 W in the iMac. This dual-core 1.4 GHz processor is slower than the i7 in the 2009 MacBook Pro and even the i5 in the MacBook Air.
