Wednesday, November 18, 2015

Supercomputer Leaders Come Together on New Open Source Framework

The Linux Foundation and a consortium of high-performance computing (HPC) leaders last week announced the creation of the OpenHPC Collaborative Project, which has the goal of creating a new open source framework to support the world's most sophisticated HPC environments. Members of the new consortium include several national laboratories and academic supercomputing facilities, as well as Intel, Hewlett-Packard, Dell, and Lenovo. Currently, 97 percent of the world's fastest supercomputers run Linux. However, as systems become faster and more powerful, a new and dedicated solution for HPC increasingly is warranted. "The use of open source software is central to HPC, but lack of a unified community across key stakeholders...has caused duplication of effort and has increased the barrier to entry," says Linux Foundation executive director Jim Zemlin. "OpenHPC will provide a neutral forum to develop one open source framework that satisfies a diverse set of cluster environment use-cases." The new initiative also is being driven by increased interest in HPC among the private sector as an aid to big data analytics. Analysts note this is especially true in the world of finance, which has significantly increased its supercomputing efforts in recent years.


ZDNet (11/12/15) Steven J. Vaughan-Nichols

Top 10 Rising and Falling Buzzwords in Tech Job Postings

Bloomberg Business

Some of the most popular tech buzzwords of a year ago, such as big data, no longer have the same cachet with job applicants, according to a study of more than 500,000 tech job postings. Older buzzwords are being replaced by newer terms, including artificial intelligence and real-time data. Among the top five buzzwords, only two were even on the map a year ago. Each term included in the study was measured using three main criteria: the number of people applying for a job containing the phrase, the percentage of those applicants with the skills and background to qualify, and the time it took to fill the role since the job was posted. The results were then ranked by changes in effectiveness from a year ago.
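The article does not publish the study's actual scoring formula, so the following is only a minimal sketch of how such a ranking might be computed; the field names, the score definition, and the ranking step are all assumptions for illustration.

```python
# Hypothetical sketch of the buzzword study's ranking method.
# The real formula, field names, and weights are not published in the article.
from dataclasses import dataclass

@dataclass
class TermStats:
    term: str
    applicants: float        # avg. applicants per posting containing the term
    qualified_share: float   # fraction of applicants with qualifying skills
    days_to_fill: float      # avg. days from posting to hire

def effectiveness(s: TermStats) -> float:
    # More qualified applicants and faster fills -> higher score (assumed form).
    return s.applicants * s.qualified_share / s.days_to_fill

def rank_by_change(now: dict[str, TermStats],
                   year_ago: dict[str, TermStats]) -> list[tuple[str, float]]:
    # Rank terms by the change in effectiveness versus a year earlier,
    # as the article says the results were ranked.
    deltas = {t: effectiveness(now[t]) - effectiveness(year_ago[t])
              for t in now.keys() & year_ago.keys()}
    return sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)
```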

Artificial intelligence is one of the new tech buzzwords, with tech job applicants attracted by listings that mention AI. Over the past six months, the term's usage among the best-performing tech job listings has quintupled. Real-time data is also a hot new buzzword. Supposedly, this term conveys that the hiring company wants to build products based on the latest information, rather than just a lot of information. Other terms that are gaining in popularity include "high availability" and "robust and scalable." Also, thanks to the rising importance of diverse workplaces to many applicants, particularly in the tech industry, job postings benefit from a reference to "inclusiveness."

The term "big data" is the biggest loser in technology job ads. Two years ago, everything was about big data, but the term began to drop off five or six months ago. Today, engineering jobs that mention "big data" perform 30 percent worse, on average, than those that do not. Corporate jargon also turns off many applicants: the term "virtual team," which refers to telecommuting, is more than 10 times as likely to appear in listings with low applicant counts as in successful ones. Other buzzwords losing ground include "troubleshooting" and "subject matter expert."

Monday, November 16, 2015

Teaching Machines to Learn on Their Own

Scientific American (11/10/15) Larry Greenemeier; Steve Mirsky 

In an interview, Xerox Palo Alto Research Center CEO Stephen Hoover discusses the swift changes machine learning is undergoing. He says computers' growing ability "to understand in much deeper ways what it is that we're asking and trying to do" is starting to be incorporated into products, such as the Nest thermostat. Nest, for example, has a built-in agent that learns from user behavior and infers context so it can anticipate how to operate. Hoover says machine learning involves the machine deducing the right answer from data input and programming itself, instead of the programmer breaking down a task into a series of steps. "You're going to show the computer a bunch of instances and you're going to label it, and it's going to learn how to do it," he says. "There's a core code which is that learning algorithm, and then that's applied to multiple contexts." Hoover credits Moore's Law with enabling continued advances in machine learning. "Hardware not only begets the capability to create new kinds of software like machine learning, but also is creating new ways to sense, measure, and control the world," he says. "And that feedback loop is again one of the big changes that we're going to see coming."
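What Hoover describes (show the computer labeled instances and let a generic learning algorithm infer the rule) is classic supervised learning. The sketch below uses scikit-learn and an invented thermostat-style dataset purely for illustration; neither appears in the interview.

```python
# Minimal supervised-learning sketch of "show it labeled instances and it
# learns how to do it." The data and task are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Labeled instances: (hour of day, home occupied?) -> desired heating state.
X = [[7, 1], [9, 0], [13, 0], [18, 1], [22, 1], [3, 0]]
y = [1, 0, 0, 1, 1, 0]  # 1 = heat on, 0 = heat off

# The "core code" is the learning algorithm itself; the same algorithm can be
# retrained on different labeled data and so applied to multiple contexts.
model = LogisticRegression().fit(X, y)
print(model.predict([[19, 1]]))  # deduce the answer for a new, unseen instance
```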

Get Ready for Your Digital Model

The Wall Street Journal (11/12/15) Pedro Domingos

Within 10 years, people will entrust their data to machine-learning algorithms that build personal digital models of them, writes University of Washington professor Pedro Domingos. He predicts a new kind of company will be conceived to store, safeguard, and apply such data to the construction, maintenance, and interactions of these models. Domingos says it would record a customer's every digital interaction and feed it to the model in exchange for a subscription fee. He notes all this would require on the technical side is a proxy server through which these interactions are routed and recorded. "Once a firm has your data in one place, it can create a complete model of you using one of the major machine-learning techniques: inducing rules, mimicking the way neurons in the brain learn, simulating evolution, probabilistically weighing the evidence for different hypotheses, or reasoning by analogy," Domingos says. He thinks these models could be duplicated almost infinitely to multitask, selecting the best options for the user based on accumulated behavior and preferences. "To offset organizations' data-gathering advantages, like-minded individuals will pool the data in their banks and use the models learned from that information," Domingos says. He predicts cyberspace will evolve into "a vast parallel world that selects only the most promising things to try out in the real one--the new, global subconscious of the human race."
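Domingos sketches the plumbing only at a high level: a proxy server records every interaction and feeds it to a learner, whose model then picks options on the user's behalf. Below is a toy record-and-learn loop in that spirit; every class, method, and data item is invented for illustration and is not from the article.

```python
# Toy sketch of the record-and-model loop Domingos describes.
# All names and the frequency-based "model" here are invented placeholders.
from collections import Counter

class PersonalModel:
    """Learns a user's preferences from recorded interactions."""
    def __init__(self):
        self.prefs = Counter()

    def record(self, interaction: str):
        # Every routed interaction updates the model of the user.
        self.prefs[interaction] += 1

    def best_option(self, options: list[str]) -> str:
        # Select the most promising option given accumulated behavior.
        return max(options, key=lambda o: self.prefs[o])

class RecordingProxy:
    """Stands in for the proxy server that routes and records traffic."""
    def __init__(self, model: PersonalModel):
        self.model = model

    def route(self, interaction: str):
        self.model.record(interaction)
        # ...forwarding to the real destination would happen here.

model = PersonalModel()
proxy = RecordingProxy(model)
for event in ["read:hpc-news", "read:ml-news", "read:ml-news"]:
    proxy.route(event)
print(model.best_option(["read:hpc-news", "read:ml-news"]))  # -> "read:ml-news"
```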

Sunday, November 15, 2015

Things Every Programmer Should Know

Java Code Geeks