The real story behind the Apple-Samsung smartphone patent war: Best of the Web

One of the greatest (and perhaps most expensive) battles in the tech industry over recent years has been the ongoing patent war between Apple and Samsung.

Now Kurt Eichenwald of Vanity Fair tells the behind-the-scenes story of this epic series of lawsuits between two of the biggest names in tech:

Steve Jobs, Apple’s mercurial chief executive, was furious. His teams had toiled for years creating a breakthrough phone, and now, Jobs fumed, a competitor—an Apple supplier no less!—had stolen the design and many features. Jobs and Tim Cook, his chief operating officer, had spoken with Samsung president Jay Y. Lee in July to express their concern about the similarities of the two phones but received no satisfactory response.

After weeks of delicate dancing, of smiling requests and impatient urgings, Jobs decided to take the gloves off. Hence the meeting in Seoul. The Apple executives were escorted to a conference room high in the Samsung Electronics Building, where they were greeted by about half a dozen Korean engineers and lawyers. Dr. Seungho Ahn, a Samsung vice president, was in charge, according to court records and people who attended the meeting. After some pleasantries, Chip Lutton, then Apple’s associate general counsel for intellectual property, took the floor and put up a PowerPoint slide with the title “Samsung’s Use of Apple Patents in Smartphones.” Then he went into some of the similarities he considered especially outrageous, but the Samsung executives showed no reaction. So Lutton decided to be blunt.

“Galaxy copied the iPhone,” he said.

And thus the patent wars began.

Why Ruby on Rails is becoming the fourth R of education

What skills do kids need to survive and get ahead in the world?

Back in the days of industrial production lines, the three R's (reading, writing and arithmetic) formed the bedrock of a good education.

However, in this age of social media, web apps, mobile devices and ubiquitous connected gadgets, should we focus more on a fourth R: Ruby on Rails?

The importance of knowing how to code as a part of basic literacy is an idea examined by Matt Richtel of the New York Times:

Seven-year-old Jordan Lisle, a second grader, joined his family at a packed after-hours school event last month aimed at inspiring a new interest: computer programming.

“I’m a little afraid he’s falling behind,” his mother, Wendy Lisle, said, explaining why they had signed up for the class at Strawberry Point Elementary School.

The event was part of a national educational movement in computer coding instruction that is growing at Internet speeds. Since December, 20,000 teachers from kindergarten through 12th grade have introduced coding lessons, according to Code.org, a group backed by the tech industry that offers free curriculums. In addition, some 30 school districts, including New York City and Chicago, have agreed to add coding classes in the fall, mainly in high schools but in lower grades, too. And policy makers in nine states have begun awarding the same credits for computer science classes that they do for basic math and science courses, rather than treating them as electives.

Despite the speed with which the idea is catching on, there are still some crucial unanswered questions:

The spread of coding instruction, while still nascent, is “unprecedented — there’s never been a move this fast in education,” said Elliot Soloway, a professor of education and computer science at the University of Michigan. He sees it as very positive, potentially inspiring students to develop a new passion, perhaps the way that teaching frog dissection may inspire future surgeons and biologists.

But the momentum for early coding comes with caveats, too. It is not clear that teaching basic computer science in grade school will beget future jobs or foster broader creativity and logical thinking, as some champions of the movement are projecting. And particularly for younger children, Dr. Soloway said, the activity is more like a video game — better than simulated gunplay, but not likely to impart actual programming skills.

So should our kids be taught how to code at school? Leave your thoughts in the comments below!

Why the age of the startup is killing American entrepreneurialism

You would have to live under a rock not to have noticed the recent excitement around Silicon Valley’s startup scene.

So it might be a little surprising at first to read the claim, made by Derek Thompson in The Atlantic, that American entrepreneurship is declining like never before:

For entrepreneurs in America, it is the best of times, and it is the worst of times. It is “the age of the start-up,” and “American entrepreneurship is plummeting.” We are witnessing the Cambrian Explosion of apps and the mass extinction of apps. These are the glory days of risk, and we are taking fewer risks than ever. Tech valuations are soaring, and tech valuations are collapsing, and tech valuations are irrelevant. “A million users” has never been more attainable, and “a million users” has never been more meaningless. It is the spring of hope. It is the winter of despair.

It turns out that the problem is not in Silicon Valley; it's on Main Street.

It seems that, in the age of the superstar tech entrepreneur, the willingness of many Americans to start a small business is waning:

What’s melting, exactly? Not the kids’ apps, but the mom-and-pop stores. Derek’s Coffee and Thompson’s Corner Store would be considered start-ups. But a new Starbucks or Whole Foods is considered part of an existing franchise. So as chains have expanded by more than 50 percent since 1983—Walmart gobbles up smaller competition with a particularly greedy appetite—start-ups have perished, as Jordan Weissmann has explained. The demise of small new companies isn’t limited to retail. Construction and manufacturing start-ups have collapsed by more than 60 percent in the last four decades.

The reasons behind this decline are interesting.

Why Fortran isn’t dead yet

Back in the 1960s and 1970s, programming languages now considered archaic, such as FORTRAN and COBOL, ruled the roost.

These days, it can be easy to assume that more modern languages and frameworks – such as Ruby on Rails, JavaScript, Objective-C, C++, Python or PHP – are the clear choice of developers.

However, as Lee Phillips at Ars Technica points out, this is not necessarily the case everywhere:

Take a tour through the research laboratories at any university physics department or national lab, and much of what you will see defines “cutting edge.” “Research,” after all, means seeing what has never been seen before—looking deeper, measuring more precisely, thinking about problems in new ways.

A large research project in the physical sciences usually involves experimenters, theorists, and people carrying out calculations with computers. There are computers and terminals everywhere. Some of the people hunched over these screens are writing papers, some are analyzing data, and some are working on simulations. These simulations are also quite often on the cutting edge, pushing the world’s fastest supercomputers, with their thousands of networked processors, to the limit. But almost universally, the language in which these simulation codes are written is Fortran, a relic from the 1950s.

According to Phillips, Fortran has survived partly because, for large-scale numerical simulation, it is still the best tool for the job.
