Wednesday, October 17, 2012

INDUSTRY NEWS by ACM TechNews


Panetta Warns of Dire Threat of Cyberattack on U.S.
New York Times (10/12/12) Elisabeth Bumiller; Thom Shanker

U.S. Defense Secretary Leon Panetta yesterday warned of the potential for disastrous consequences if an enemy of the U.S. were to carry out a cyberattack on the nation's critical infrastructure.  Panetta says the U.S.'s adversaries are becoming increasingly aggressive and are improving their technology, so much so that they could launch cyberattacks on vulnerable computer systems used to operate the power grid, transportation system, financial networks, and the government.  He says these attacks could result in the derailment of passenger trains carrying dangerous chemicals, the contamination of water supplies in major U.S. cities, or the failure of the nation's power grid.  Panetta says the most worrisome scenario is a cyberattack on critical infrastructure carried out in tandem with a physical attack, which would amount to a cyber-Pearl Harbor that would cause physical destruction and the loss of life, and could terrorize the populace to such an extent that it would create "a profound new sense of vulnerability."  However, Panetta says improved cyberdefenses alone will not prevent a cyberattack against the nation's critical infrastructure, which is why the Defense Department has developed the ability to conduct "effective operations" to mitigate threats to U.S. interests in cyberspace.
http://www.nytimes.com/2012/10/12/world/panetta-warns-of-dire-threat-of-cyberattack.html


What Are Grand Technology and Scientific Challenges for the 21st Century?
Network World (10/10/12) Michael Cooney

The U.S. Defense Advanced Research Projects Agency and the White House Office of Science and Technology Policy (OSTP) recently put out a public call for ideas that could form Grand Challenges, which are ambitious but achievable goals that would lead to substantial breakthroughs in science and technology.  Grand Challenges would have a major impact on fields such as health, energy, sustainability, education, economic opportunity, national security, and human exploration.  They also would help drive and harness innovation and advances in science and technology.  Other organizations also have called for challenges to promote science and technology innovation.  For example, X Prize recently announced a list of eight key challenges that could become public competitions in the near future, including a personal health monitoring system, brain-computer interfaces, wireless power transmission, and ultra-fast point-to-point travel.  In addition, the National Research Council recently highlighted five challenges that focus on new optics and photonics technologies and improving the energy grid.  OSTP deputy director Thomas Kalil also has highlighted several challenges and ideas, such as encouraging research universities to launch the Grand Challenge Scholars Program, which enables engineering students to organize their coursework, research, service, international studies, and experiential learning around a Grand Challenge.
http://www.networkworld.com/community/node/81573


First Evidence for Iran's Parallel Halal Internet
New Scientist (10/10/12) Sara Reardon

Iranian officials have long discussed developing a religiously acceptable internal network, known as the "halal" Internet, which is isolated from the World Wide Web, and security researcher Collin Anderson recently found evidence that elements of this parallel Internet have already been created.  Anderson found that telecommunications companies in Iran allocate two Internet Protocol (IP) addresses to every machine that connects to the Internet.  One of the IP addresses is a conventional Web address, while the other one is only accessible from within the country.  The internal network can handle up to about 17 million IP addresses, and already has about 10,000 connected devices, including those in private homes, government buildings, and e-commerce sites.  The network also contains academic Web sites and email services.  Anderson speculates that the internal network will contain Iran-specific content and own-brand versions of popular services.  The government would then hold back connections to outside networks, making them unusably slow and forcing Iranian users onto the national network, Anderson suggests.  However, he notes Iranians will still be able to access the Internet through the anonymizing network Tor and virtual private networks.
http://www.newscientist.com/article/mg21628865.700-first-evidence-for-irans-parallel-halal-internet.html
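
The roughly 17 million internal addresses reported is close in size to a single /8 block, which suggests how a dual-addressed host might be examined.  The following is a minimal sketch, not Anderson's actual methodology, of classifying a machine's two observed addresses as internal-only or globally routable; the concrete 10.0.0.0/8 range and the sample addresses are illustrative assumptions, not the ranges Iran actually uses.

# Minimal sketch (not Anderson's methodology): classify a host's two observed
# IP addresses as internal-only or globally routable.  The ~17 million internal
# addresses reported matches the size of a single /8 block; the concrete
# 10.0.0.0/8 range used here is an illustrative assumption.
import ipaddress

ASSUMED_INTERNAL_RANGE = ipaddress.ip_network("10.0.0.0/8")  # hypothetical range

def classify(addr: str) -> str:
    ip = ipaddress.ip_address(addr)
    if ip in ASSUMED_INTERNAL_RANGE or not ip.is_global:
        return "internal-only"
    return "globally routable"

# A dual-addressed client as described in the article: one public address and
# one address reachable only from inside the country (both invented here).
for observed in ("5.160.12.34", "10.42.7.19"):
    print(observed, "->", classify(observed))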


DHS Urged to Create Reserve Cadre of Cyber Experts
NextGov.com (10/11/12) Aliya Sternstein

The U.S. Department of Homeland Security (DHS) should develop a reserve army of cyberspecialists from across government and industry to address emergencies, according to a cyber skills task force report.  The report says the National Guard-like group of cyberexperts, called the CyberReserve, will ensure that capable professionals are ready in times of national crisis.  The report notes that the CyberReserve will be successful if DHS maintains current information on relevant former personnel now at other agencies and companies, as well as unaffiliated experts from government and industry.  Congress also has pushed for a formal cyber national guard, according to DHS deputy secretary Jane Holl Lute.  A 2002 law allowed Homeland Security to create a NET Guard comprising volunteer experts from across the country for cyberresponse.  Lute expects the department would look to Defense Department components and veterans organizations, as well as outside groups, for people with the necessary skills.  "Surge rosters do require active management," Lute says.  "It's not something where you type up a page and throw it in the drawer.  People's skills have to be current."
http://www.nextgov.com/cybersecurity/2012/10/dhs-urged-create-reserve-cadre-cyber-experts/58704/


MIT's CSAIL Launches New Center to Tackle the Future of Wireless and Mobile Technologies
MIT News (10/11/12)

The Massachusetts Institute of Technology's (MIT's) Computer Science and Artificial Intelligence Laboratory recently launched Wireless@MIT, an interdisciplinary center focused on developing next-generation wireless networks and mobile devices.  The center, which will tackle some of the most pressing issues facing the wireless and mobile-computing fields, will involve more than 50 MIT faculty members, research staff, and graduate students across various labs and academic departments, and will work with seven founding industry affiliates.  "The goal of our center is to push the frontiers of wireless research to their full potential, and to ensure that the industry that grows up around these new devices is able to work in innovative and productive ways," says MIT professor Hari Balakrishnan.  The center will focus on the growing scarcity of radio spectrum caused by booming wireless use, on reducing power consumption and extending battery life on mobile devices, and on developing new applications that accommodate mobility and network variability.  "The center aims to unleash a wide range of mobile uses that will change the way we live, work, and entertain," says MIT professor Dina Katabi.  The researchers currently are working on projects involving transportation, health care, education, collaboration, and environmental sustainability.
http://web.mit.edu/newsoffice/2012/wireless-research-center-founded-1011.html


Digital Tabletop System With Views on Demand
Bristol University (10/09/12)

At the 25th ACM Symposium on User Interface Software and Technology (UIST 2012), scientists presented a tabletop system that supports mixed-focus collaborative tasks.  Personalized view-overlays for tabletops (PiVOT) uses two view zones to provide individual users with personalized views, while presenting an unaffected and unobstructed shared view to all users.  The system supports multiple personalized views, which can occupy the same spatial location and yet be visible only to the users to whom they belong.  "For example, when looking at a city map, if I want to see traffic information I can lean forward and see the traffic-overlay while other people can, at the same time, lean forward and see the elevation information for a particular street," says Bristol professor Sriram Subramanian.  "Everyone else who is not leaning forward will continue to see the undistorted city map.  Their view will not interfere with mine, even if it is on the same spatial location."  Bristol researchers led the development of PiVOT, which also allows the creation of personal views that can be either two-dimensional or auto-stereoscopic three-dimensional images.  The team used an arrangement of liquid crystals to create the tabletop system.
http://www.bristol.ac.uk/news/2012/8846.html


Robots Using Tools: With New Grant, Researchers Aim to Create 'MacGyver' Robot
Georgia Tech News (10/09/12) John Toon

Georgia Tech researchers are studying ways to give robots the ability to use objects in their environments to accomplish high-level tasks.  "Our goal is to develop a robot that behaves like MacGyver ... who solved complex problems and escaped dangerous situations by using everyday objects and materials he found at hand," says Georgia Tech professor Mike Stilman.  The researchers are developing algorithms that turn tasks that would be impossible for a robot alone into tasks that are possible for a robot equipped with tools.  The algorithms will enable a robot to identify an arbitrary object in a room, determine its potential function, and turn it into a simple machine that can be used to complete an action.  Stilman is collaborating with Institute for the Study of Learning and Expertise director Pat Langley and University of Kansas professor Dongkyu Choi.  Langley and Choi will expand the cognitive architecture they developed, which provides an infrastructure for modeling various human capabilities in robots.  "We believe a hybrid reasoning system that embeds our physics-based algorithms within a cognitive architecture will create a more general, efficient, and structured control system for our robot that will accrue more benefits than if we used one approach alone," Stilman says.
http://www.gatech.edu/newsroom/release.html?nid=160721


Free Program Makes Computer Graphics More Realistic
Cornell Chronicle (10/08/12) Bill Steele

Cornell University researchers have developed a new version of Mitsuba, a free, open source rendering program used by computer graphics researchers worldwide.  "The goal of my project is to create cutting-edge software that makes it considerably easier," says Cornell Ph.D. student Wenzel Jakob.  The new version offers an improved user interface, as well as mathematical advances that accelerate processing and enhance realism.  Jakob notes that in academia there is a drive for realism that has brought forth new developments, but these are only slowly making their way into commercial software.  "What really is new is that Mitsuba implements a group of rendering algorithms that traditionally have been horribly complicated," he says.  For example, the new version includes an algorithm called Metropolis Light Transport, which manages the complex behavior of light traveling through glossy materials such as brushed metal or glass.  "It's been very rewarding to watch this software grow from a small project a few years back into one of the most sophisticated renderers available," says Cornell professor Steve Marschner.
http://www.news.cornell.edu/stories/Oct12/Mitsuba.html
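
The Metropolis Light Transport algorithm mentioned above adapts the classic Metropolis-Hastings sampler to light paths: mutate the current path, then accept the mutation with probability proportional to the ratio of the two paths' contributions to the image.  The toy sketch below (illustrative only, not Mitsuba code) shows that acceptance rule on a one-dimensional stand-in for a path-contribution function.

# Toy illustration (not Mitsuba code) of the Metropolis-Hastings step that
# Metropolis Light Transport builds on: mutate the current light path, then
# accept the mutation with probability min(1, f(new)/f(old)).  Here a 1-D
# function f stands in for a path's contribution to the image.
import random

def f(x):
    # Stand-in for a path contribution: a narrow, hard-to-find bright feature.
    return max(0.0, 1.0 - abs(x - 0.7) * 20) + 0.05

def metropolis_samples(n, mutation_size=0.05):
    x = random.random()                  # initial path (seed)
    samples = []
    for _ in range(n):
        y = x + random.uniform(-mutation_size, mutation_size)  # mutate path
        if 0.0 <= y <= 1.0:
            accept = min(1.0, f(y) / f(x))   # acceptance probability
            if random.random() < accept:
                x = y                        # keep the mutated path
        samples.append(x)
    return samples

# Samples concentrate where f is large, i.e. around the "bright" region x ~ 0.7.
print(sum(0.6 < s < 0.8 for s in metropolis_samples(10_000)) / 10_000)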


The Seeds That Federal Money Can Plant
New York Times (10/06/12) Steve Lohr

Government support is key to facilitating new ideas that are harvested by the private sector, creating companies and jobs, according to a recent U.S. National Research Council (NRC) report.  The report examined eight computing technologies and found that the portion of revenue at 30 well-known corporations that could be traced back to the seed research backed by government agencies totaled almost $500 billion a year.  "If you take any major information technology company today, from Google to Intel to Qualcomm to Apple to Microsoft and beyond, you can trace the core technologies to the rich synergy between federally funded universities and industry research and development," says Microsoft's Peter Lee, who led the NRC committee that produced the report.  However, government research funding is threatened by the Budget Control Act, which is scheduled to take effect in January and calls for across-the-board cuts in discretionary spending.  A recent American Association for the Advancement of Science study found that federal spending on research and development would be trimmed by more than $12 billion in 2013, and the National Science Foundation would have its budget cut by more than $450 million.
http://www.nytimes.com/2012/10/07/technology/making-the-case-for-a-government-hand-in-research.html


Australian National University to Switch on Largest Supercomputer Next Week
Computerworld Australia (10/05/12) Byron Connolly

The Australian National University (ANU) soon will begin testing the country's most powerful supercomputer, which uses Fujitsu's PRIMERGY x86 high performance computing clustered design and Intel Xeon E5 central processing units to provide up to 1.2 petaflops of processing power.  The system also has 176 terabytes of memory and 12 petabytes of disk storage.  About half of the supercomputer's power will be dedicated to modeling earth systems such as weather and long-term climate change, says ANU professor Lindsay Botten.  Researchers want to use the supercomputer to provide seasonal weather modeling over a few months rather than a few days, Botten notes.  The system's size and memory capacity will enable researchers to work at much higher resolution to gain more accurate results about the potential impact of severe thunderstorms.  In the future, other government agencies, universities, and private enterprises also will have access to the supercomputer.  The $100 million, four-year supercomputing project is a partnership between the ANU and several other Australian universities, the Commonwealth Scientific and Industrial Research Organization, the Bureau of Meteorology, Geoscience Australia, and the Australian government.
http://www.computerworld.com.au/article/438342/australian_national_university_switch_largest_supercomputer_next_week/


Why NASA Thinks a Supercomputer on the Moon Might Not Be Pure Fiction
Government Computer News (10/05/12) John Breeden II

University of Southern California graduate student Ouliang Chang recently proposed building a supercomputer on the moon.  And although the idea might seem like science fiction, the U.S. National Aeronautics and Space Administration (NASA) may be interested in it.  Since at least 2009, NASA has considered upgrading its Deep Space Network to a true Internet in space.  The current network supports space missions exploring the solar system and some Earth-orbiting missions.  However, there could be a need for Chang's moon base because the existing network is becoming too crowded to handle all of the data coming from the various probes, satellites, and robots that are exploring the solar system.  The missions compete for time and bandwidth, and the situation will only get worse.  Each new spacecraft launch is like adding a new client to the network, and the moon base concept would be akin to adding a new router and server to that network, which would accept signals from space, store them, process them when necessary, and then transmit the data back to Earth as time and bandwidth permit.  Chang wants to put a supercomputer data center inside a moon crater, facing toward deep space, and notes some interesting problems, such as the need to detect incoming asteroids.
http://gcn.com/articles/2012/10/05/supercomputer-on-the-moon.aspx
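
The store-and-forward role described for the lunar data center can be sketched as a simple buffering relay: accept transmissions from missions, queue them, and downlink them to Earth only as bandwidth allows.  The sketch below is an editor's illustration of that idea, not a NASA design; the class name, byte budgets, and mission names are invented.

# Conceptual sketch (an illustration, not a NASA design) of the store-and-
# forward role the article describes: accept transmissions from deep-space
# missions, buffer them, and relay them to Earth as downlink bandwidth permits.
from collections import deque

class LunarRelay:
    def __init__(self, downlink_bytes_per_pass):
        self.buffer = deque()
        self.downlink_budget = downlink_bytes_per_pass

    def receive(self, mission, payload_bytes):
        # Store data from a probe or satellite; processing/compression could happen here.
        self.buffer.append((mission, payload_bytes))

    def downlink_pass(self):
        # Transmit queued data to Earth until this pass's bandwidth budget is used up.
        sent, remaining = [], self.downlink_budget
        while self.buffer and self.buffer[0][1] <= remaining:
            mission, size = self.buffer.popleft()
            remaining -= size
            sent.append(mission)
        return sent

relay = LunarRelay(downlink_bytes_per_pass=1_000_000)
relay.receive("Mars orbiter", 600_000)
relay.receive("asteroid probe", 700_000)
print(relay.downlink_pass())   # only the Mars orbiter data fits in this pass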


NASA Issues Big Data Challenge
InformationWeek (10/05/12) Patience Wait

The U.S. National Aeronautics and Space Administration (NASA), the Department of Energy, and the National Science Foundation recently created the Big Data Challenge, a competition to develop new ways to capitalize on the data generated by the federal government.  The Big Data Challenge aims to identify ways to analyze and share large amounts of data across government, using information sets taken from the health, energy, and earth science fields.  The challenge involves four contests, the first of which is an ideation challenge that seeks ideas for tools and techniques that can smooth out the gaps in disparate data sources and subjects.  The top three finishers will each receive $500 in prize money.  "Big data is characterized not only by the enormous volume or the velocity of its generation, but also by the heterogeneity, diversity, and complexity of the data," says the Big Data Senior Steering Group's Suzi Iacono.  "There are enormous opportunities to extract knowledge from these large-scale diverse data sets, and to provide powerful new approaches to drive discovery and decision-making, and to make increasingly accurate predictions."  The contest will be hosted by the NASA Tournament Lab.
http://www.informationweek.com/government/information-management/nasa-issues-big-data-challenge/240008575


Creative Blocks
Aeon Magazine (10/03/12) David Deutsch

Despite the universality of computation dictating that artificial general intelligence (AGI) must be possible, progress in this field has been stalled for decades, writes University of Oxford physicist David Deutsch.  Underlying the stagnation is a failure to realize that what makes human brains distinct from all other physical systems is qualitatively different from all other functionalities and cannot be specified in the way that all other properties of computer programs can be.  A new philosophy whose central tenet is a theory that explains how brains generate explanations is required.  For now, scientists are locked in the conventional wisdom that theories are derived from induction, and this is constraining AGI progress by erroneously assuming that thinking is a process of anticipating that future patterns of sensory experience will resemble past ones.  In fact, thinking involves criticism and correction of partially true guesses with the goal of pinpointing and removing errors and misconceptions in those guesses, as opposed to producing and justifying extrapolations from sense data.  The induction approach to AGI greatly relies on Bayesianism, which assumes that minds function by assigning likelihoods to ideas and modifying them in the light of experience as a way of choosing how to act.  Deutsch says this only permits a behavioristic model of an AGI's values.
http://www.aeonmagazine.com/being-human/david-deutsch-artificial-intelligence/
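
For readers unfamiliar with the Bayesianism Deutsch criticizes, the sketch below illustrates the basic move he objects to treating as a model of thinking: holding credences over hypotheses and revising them by Bayes' rule as observations arrive.  The hypotheses and likelihood values are invented purely for illustration.

# Minimal illustration of Bayesian updating: a "mind" holds credences over
# hypotheses and revises them as evidence arrives via Bayes' rule.  The
# hypotheses and likelihoods below are invented for illustration only.
def bayes_update(priors, likelihoods):
    """priors, likelihoods: dicts mapping hypothesis -> probability."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

priors = {"pattern will continue": 0.5, "pattern will break": 0.5}
likelihoods = {"pattern will continue": 0.9,   # P(observation | hypothesis)
               "pattern will break": 0.2}
print(bayes_update(priors, likelihoods))
# Credence shifts toward "pattern will continue" after one observation.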
