Wednesday, August 24, 2011

Only Lebanese Linux Blog


W3C Ignites Developer Participation in Web Standards Process

eWeek (08/16/11) Darryl K. Taft

The World Wide Web Consortium (W3C) recently announced a new track, called W3C Community Groups, that makes it easier for developers and businesses to create Web technology within W3C's international community of experts. The move is an effort to support the rapid evolution of Web technology, according to W3C officials. W3C Community Groups promote diverse participation, allowing anyone to propose a group, which leads to lots of small groups, even those that have minimal peer support. Additionally, there are no fees to participate and active groups can work indefinitely. "As the pace of innovation accelerates and more industries embrace W3C's Open Web Platform, Community Groups will accelerate incorporation of innovative technologies into the Web," says W3C CEO Jeff Jaffe. The W3C also recently announced the launch of Business Groups, which provide W3C members and non-members with a vendor-neutral forum to develop market-specific technologies that impact Web standards. "Developers can propose ideas to the extensive W3C social network, and in a matter of minutes start to build mindshare using W3C's collaborative tools or their own," says W3C's Harry Halpin.

Yahoo! Uses Facebook to Test Six Degrees of Separation

San Jose Mercury News (08/15/11) Mike Swift

Facebook and Yahoo! researchers have embarked on an experiment to determine how many online connections it takes, on average, for people to communicate a message to a person they do not know. The outcome could address lingering issues about the degrees of separation between people. Each member of Facebook has an average of 130 friends on the social network, which is visualized as a person's social graph. The current Small World experiment is designed to test people's effectiveness at transmitting a message from friend to friend to measure the actual level of connection between people, according to Yahoo! scientist Duncan Watts. The Facebook-Yahoo collaboration could cover a much wider participant population than Stanley Milgram's famous 1960s-era experiment, whose results were based on just 64 out of 300 letters that reached their targets. University of California, Berkeley professor Phil Cowan says the validity of the current experiment would be reliant on whether it is representative of the general populace, especially since users of social networks tend to be younger.
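A quick back-of-the-envelope calculation (ours, not the researchers') shows why only a handful of hops could in principle span the planet; the Scala sketch below assumes, unrealistically, that every hop reaches 130 entirely new people:
  // Idealized reach after d hops when everyone has 130 friends and
  // friend circles never overlap (a deliberately unrealistic assumption).
  val friendsPerPerson = BigInt(130)
  val worldPopulation  = BigInt("7000000000") // roughly 7 billion in 2011

  for (hops <- 1 to 6) {
    val reach = friendsPerPerson.pow(hops)
    val note  = if (reach >= worldPopulation) "exceeds world population" else "still short"
    println(s"$hops hop(s): up to $reach people ($note)")
  }
Under this idealization five hops already exceed seven billion people; it is the heavy overlap between real friend circles that pushes the practical figure toward six degrees.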

The A-Z of Programming Languages: From Pizza to Scala

Techworld Australia (08/18/11) Lisa Banks

Scala, Twitter's underlying programming language, was developed by Martin Odersky in an attempt to incorporate some functional elements, first within a language extension of Java, and then within a language that did not have Java's restrictions. "At first, this was an experiment, to answer the question whether we could achieve a tight integration between functional and object-oriented programming and whether this would lead to a useful programming model and language," Odersky says in an interview. Scala increasingly developed into an open source environment and then into a language that could be commercially supported. Odersky says one of the biggest challenges of developing Scala was having a language distinct from Java but that also is simultaneously interoperable with Java. He notes that Scala is increasingly used in projects inside Twitter. "Scala keeps much of the 'feel' of a dynamic language even though it is statically typed," Odersky says. "It thus tends to appeal to people who come from dynamic languages--not all of them but some of them."
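For readers who have never seen the language, here is a small, self-contained snippet (our illustration, not code from Twitter or from the interview) showing the "feel" Odersky describes: no type annotations are written below, yet everything is checked at compile time.
  // Count words across a few made-up messages; all types are inferred.
  case class Message(user: String, text: String)

  val messages = List(
    Message("alice", "Scala mixes functional and object-oriented styles"),
    Message("bob",   "statically typed yet it reads like a scripting language")
  )

  val wordCounts = messages
    .flatMap(_.text.toLowerCase.split("\\s+"))
    .groupBy(identity)
    .map { case (word, occurrences) => word -> occurrences.size }

  wordCounts.toSeq.sortBy(pair => -pair._2).take(3).foreach(println)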

Wednesday, August 17, 2011

DVD to Motorola XOOM: Part One: K9Copy: DVD to ISO Image






K9Copy is a program that allows you to copy DVDs (and other audio-video media) in Linux. It has the following features:
  • Video compression (to make the video fit on a 4.7GB recordable DVD, or any size desired)
  • DVD burning
  • Creation of ISO images
  • Choice of audio and subtitle tracks to be copied
  • Title preview (video only)
  • Preservation of original menus
Warning: As always, check the relevant copyright laws for your country regarding the backup of any copyright-protected DVDs and other media.

Installing K9Copy from the Repositories


K9Copy requires the multiverse repository; enable it in Software Sources.
Search for the package k9copy in Synaptic, Adept, KPackageKit, or other package manager.
Or, to install via terminal:
  •  sudo apt-get install k9copy

Activate Support for Encrypted DVDs


Encrypted DVD playback requires libdvdcss2 from the Medibuntu repositories; K9Copy needs it installed for full functionality.
See RestrictedFormats for more information on playing encrypted DVDs and other non-free media.
See http://packages.medibuntu.org/ for links to the current package versions, as they may have been updated since this how-to was written. The exact page is http://packages.medibuntu.org/<Release-Name>/libdvdcss2.html, replacing <Release-Name> with the name of your release (for example, http://packages.medibuntu.org/karmic/libdvdcss2.html for Karmic); this reflects the Medibuntu URL format as of 2010-02-06.
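As an illustration only, the commands below sketch the procedure Medibuntu documented at the time (add the repository list for your release, install the keyring, then install libdvdcss2); treat them as a sketch and check the Medibuntu site for the current, authoritative instructions:
  sudo wget --output-document=/etc/apt/sources.list.d/medibuntu.list http://www.medibuntu.org/sources.list.d/$(lsb_release -cs).list
  sudo apt-get update
  # the Medibuntu keyring is not yet trusted at this point, hence --allow-unauthenticated
  sudo apt-get --yes --allow-unauthenticated install medibuntu-keyring
  sudo apt-get update
  sudo apt-get install libdvdcss2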

Using K9Copy


K9Copy should now appear under Menu -> Applications -> Sound & Video in Gnome (Ubuntu) or Menu -> Multimedia in KDE (Kubuntu) or XFCE (Xubuntu). You can also start K9Copy from a command-line terminal (or by pressing Alt + F2):
 k9copy

Quickstart guide


K9Copy works in both KDE and Gnome. To copy a normal DVD and shrink it down to 4.7 GB, set the input device to your DVD drive and the output to an ISO image. Open the video using the 'folder' icon at top left. Under the MPEG4 Encoding tab, set the video codec to 'copy'. Ticking the '2 pass' box will give a higher-quality copy. Set the file size to, say, 4700 MB (note that this value has reportedly been slightly exceeded in some cases).
Select the title(s) of interest, then press the circular 'to DVD' icon. You will be asked for a location in which to create the ISO copy. Note that when the process starts, it misleadingly says 'Burning DVD'; what it is actually doing is creating the ISO copy on the hard drive, so rest assured the source DVD is NOT being overwritten. A DVD can then be created from the resulting ISO file with another program, e.g. Nautilus (right-click on the ISO) or K3b. Although K9Copy can also burn the ISO (or a DVD structure), using a separate program has been reported to be more reliable.
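If you prefer the command line, the ISO can also be burned with growisofs from the dvd+rw-tools package. A minimal sketch, assuming the burner is /dev/sr0 and the image was saved as movie.iso (adjust both to your system):
  growisofs -dvd-compat -Z /dev/sr0=movie.iso
Here -Z writes the image as the initial session and -dvd-compat closes the disc for better compatibility with standalone players.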
Tip: To preview the titles, highlight the title/chapter of interest and click on the 'movie camera' icon; only then can you use the stop/play buttons in the preview window.
If you wish to change the Menu structure (reauthor the DVD), you can use qdvdauthor.

Tuesday, August 16, 2011

DOE at Work on Scientific 'Knowledgebase'

Internet Evolution (08/08/11) Ariella Brown

The U.S. Department of Energy (DOE) recently completed the research and development phase of the Systems Biology Knowledgebase (Kbase) project, which aims to make plant and microbial life data more accessible for scientific sharing and integration. Kbase will be useful to those who want to apply the project's data, metadata, and tools for modeling and predictive technologies to help produce renewable biofuels and reduce carbon in the environment, according to DOE researchers. The integration of the data, combined with computational models, is expected to greatly accelerate scientific advancements. "There are many different 'silos' of information that have been painstakingly collected; and there are a number of existing tools that bring some strands of data into relation," says Michael Schatz, a quantitative biologist involved in the project. "But there is no overarching tool that can be used across silos." Kbase would enable different labs to submit data, advancing the body of knowledge and spurring innovations in predictive biology. DOE hopes Kbase will evolve into a system that can grow as needed and be used by scientists without extensive training in applications.

NSF's Seidel: 'Software Is the Modern Language of Science'

HPC Wire (08/09/11) Jan Zverina

Software is the core technology behind a dramatic acceleration in all areas of science, and the U.S. National Science Foundation (NSF) and the science community are faced with the challenge of building a cyberinfrastructure model that integrates advancing technologies, says NSF's Edward Seidel. NSF's Extreme Science and Engineering Discovery Environment (XSEDE) program, the successor to the TeraGrid project, is intended to be the most powerful cluster of advanced digital resources and services in the world. "I think XSEDE probably marks the beginning of a national architecture with the capability of actually putting some order into [multiple approaches to observation, experimentation, computation, and data analysis]," says Seidel. Technological innovations are creating a multilevel cybercrisis, including the problem of managing the exponentially increasing data volumes produced by various digital resources, according to Seidel. This flood of data offers opportunities for potentially powerful national and possibly global collaborations. "We need to be thinking about developing cyberinfrastructure, software engineering, and capabilities to mix and match components, as well as data sharing policies, that really enable scenarios such as coupled hurricane and storm surge prediction, as well as the human response to such events," Seidel says.

Friday, August 12, 2011

How Computational Complexity Will Revolutionize Philosophy

Technology Review (08/10/11)

Massachusetts Institute of Technology computer scientist Scott Aaronson argues that computational complexity theory will have a transformative effect on philosophical thinking about a broad spectrum of topics such as the challenge of artificial intelligence (AI). The theory focuses on how the resources required to solve a problem scale with some measure of the problem size, and how problems typically scale either reasonably slowly or unreasonably rapidly. Aaronson raises the issue of AI and whether computers can ever become capable of human-like thinking. He contends that computability theory cannot provide a fundamental impediment to computers passing the Turing test. A more productive strategy is to consider the problem's computational complexity, Aaronson says. He cites the possibility of a computer that records all the human-to-human conversations it hears, accruing a database over time with which it can make conversation by looking up human answers to questions it is presented with. Aaronson says that although this strategy works, it demands computational resources that expand exponentially with the length of the conversation. This, in turn, leads to a new way of thinking about the AI problem, and by this reasoning, the difference between humans and machines is basically one of computational complexity.
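To make the blow-up concrete, here is a rough count of the table such a lookup-table chatbot would need; the figure of 1,000 plausible sentences per turn is our arbitrary illustration, not Aaronson's number:
  // If each turn can be any of V plausible sentences, the lookup table must
  // hold a reply for every possible conversation prefix: about V^n entries
  // after n turns.
  val sentencesPerTurn = BigInt(1000)

  for (turns <- 1 to 8)
    println(s"$turns turn(s): ~${sentencesPerTurn.pow(turns)} stored conversations")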

PCAST Sustainability Report Emphasizes 'Informatics Technologies'

CCC Blog (08/10/11) Erwin Gianchandani

Better accounting of ecosystem services and broader protection of environmental capital are the recommendations of a new report from the U.S. President's Council of Advisers on Science and Technology (PCAST). The report calls on the federal government to institute and finance a Quadrennial Ecosystems Services Trends Assessment that taps existing monitoring programs and newly recommended activities to identify trends associated with ecosystem sustainability and possible policy responses, and recommends the expansion of informatics technologies. "The collection of data is an essential first step in the powerful and rapidly developing new field of informatics," PCAST notes in its report. "The power of these techniques for deriving insights about the relationships among biodiversity, other ecosystem attributes, ecosystem services, and human activities is potentially transformative." To address the extreme heterogeneity of the data, PCAST urges the establishment of an informatics infrastructure embodying mechanisms for making data openly and readily available in formats accessible to both humans and machines. The council also presses for the provision of standards that foster interoperability, and decision support software incorporating insights from government, industry, and academia.

Producing Open Source Software


Advanced Linux Programming


If you are a developer for the GNU/Linux system, this book will help you to:
  • Develop GNU/Linux software that works the way users expect it to.
  • Write more sophisticated programs with features such as multiprocessing, multi-threading, interprocess communication, and interaction with hardware devices.
  • Improve your programs by making them run faster, more reliably, and more securely.
  • Understand the peculiarities of a GNU/Linux system, including its limitations, special capabilities, and conventions.

Wednesday, August 10, 2011

IT Solution to Improve Hospital Workflow and Schedules

Queensland University of Technology (08/03/11) Sandra Hutchinson
Researchers at the Queensland University of Technology (QUT) and GECKO have successfully tested an information technology system designed to improve the scheduling of resources and workflow in surgical settings. "This positive assessment by the Hetzelstift clinicians is backed up by clinicians from another hospital to whom the system was demonstrated on a single laptop, but without deploying the system in the hospital's environment," says QUT's Chun Ouyang. The solution is based on an automated workflow system known as YAWL, which is one of QUT's largest open source initiatives, and improved management of surgery-related resources. "The benefit of this system, over a human-run system, is that it is more accurate, efficient, and quicker and is able to adapt to any changes in the availability of staff, equipment, or delays immediately," Ouyang says. The system has the potential to help save time, enabling hospitals to serve more patients, and help save money on equipment purchases.

Wireless Network in Hospital Monitors Vital Signs

Washington University in St. Louis (08/04/11) Diana Lutz
Washington University in St. Louis researchers launched a prototype sensor network in Barnes-Jewish Hospital, with the goal of creating a wireless virtual intensive-care unit in which the patients are free to move around. When the system is fully operational, sensors will monitor the blood oxygenation level and heart rate of at-risk patients once or twice a minute. The data is transmitted to a base station and combined with other data in the patient's electronic medical record. The data is analyzed by a machine-learning algorithm that looks for signs of clinical deterioration, alerting nurses to check on patients when those signs are found. The clinical warning system is part of a new wireless health field that could change the future of medicine, says Washington University in St. Louis computer scientist Chenyang Lu. In developing the system, the researchers were focused on ensuring that the network would always function and never fail. The relay nodes are programmed as a self-organizing mesh network, so that if one node fails the data will be rerouted to another path. At the end of the trial, the researchers found that data were reliably received more than 99 percent of the time and the sensing reliability was 81 percent.
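As a toy sketch of the rerouting idea only (not the actual Washington University software), the snippet below searches a small, hard-coded mesh for any surviving path from a bedside sensor to the base station once a relay has failed:
  // Breadth-first search for a path through a tiny mesh, skipping failed nodes.
  type Node = String

  val links: Map[Node, Set[Node]] = Map(
    "sensor" -> Set("relay1", "relay2"),
    "relay1" -> Set("sensor", "relay3", "base"),
    "relay2" -> Set("sensor", "relay3"),
    "relay3" -> Set("relay1", "relay2", "base"),
    "base"   -> Set("relay1", "relay3")
  )

  def route(from: Node, to: Node, failed: Set[Node]): Option[List[Node]] = {
    @annotation.tailrec
    def bfs(frontier: List[List[Node]], seen: Set[Node]): Option[List[Node]] =
      frontier match {
        case Nil                          => None
        case path :: _ if path.head == to => Some(path.reverse)
        case path :: rest =>
          val next = (links.getOrElse(path.head, Set.empty[Node]) -- seen -- failed).toList
          bfs(rest ++ next.map(_ :: path), seen ++ next)
      }
    bfs(List(List(from)), Set(from) ++ failed)
  }

  println(route("sensor", "base", Set.empty))     // direct: sensor -> relay1 -> base
  println(route("sensor", "base", Set("relay1"))) // rerouted: sensor -> relay2 -> relay3 -> base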

Monday, August 8, 2011

Above the Clouds: An Interview With Armando Fox

HPC in the Cloud (08/02/11) Nicole Hemsoth

University of California, Berkeley professor Armando Fox, co-founder of the Reliable, Adaptive, and Distributed Systems Laboratory, co-wrote a paper that outlined some early challenges and advantages of high performance computing (HPC) clouds. He says in an interview that "there's a huge and largely untapped 'new middle class' of scientific computing users who would immediately benefit from running medium-to-large jobs on public clouds." Among the reasons for this is the fact that experiments on the public cloud can go faster because the provisioning of virtualized machines is quick and obviates the need for users to wait for their turn to use computer resources, while cost associativity also is realized. Although Fox acknowledges that the current cloud computing architecture may not be appropriate for all scientific computing users, he says that much can be done to customize cloud offerings to HPC. "And the exciting part here is that because of the scale and volume of commodity clouds, scientific computing users have a chance to do something they've never really had before--to influence the design of commodity equipment," Fox notes. He projects that cloud computing will be the sole means for performing big data analytics.

Computing Giants Launch Free Science Metrics

Nature (08/02/11) Declan Butler

Google and Microsoft recently launched free tools that will enable researchers to analyze citation statistics, visualize research networks, and track the most popular research fields. Google Scholar recently added Google Scholar Citations (GSC), which lets researchers create personal profiles showing all their articles in the Google Scholar database. The profile also shows how many times the papers have been cited, as well as other citation metrics such as the h-index, which measures the productivity of a scientist and the impact of their publications. Microsoft Academic Search (MAS) recently added a suite of tools, including visualizations of citation networks, publication trends, and rankings of the leading researchers in a field. Although Microsoft's platform has more features, GSC has a significant size advantage, which makes its metrics more accurate and reliable, according to researchers. MAS' content, which has grown from 15.7 million to 27.1 million publications between March 2011 and June 2011, is still a developing offering in the community, according to Microsoft Research Connections' Lee Dirks. "This is not about competition, this is about providing an open platform for academic research," Dirks says.
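The h-index itself is easy to compute; here is a short sketch using made-up citation counts (not data from either service):
  // h-index: the largest h such that h of the papers have at least h citations each.
  val citations = List(42, 17, 9, 6, 5, 3, 1)   // hypothetical citation counts

  val hIndex = citations
    .sorted(Ordering[Int].reverse)
    .zipWithIndex
    .takeWhile { case (cites, rank) => cites >= rank + 1 }
    .length

  println(s"h-index = $hIndex")   // 5 for the counts above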

China's Supercomputing Goal: From 'Zero to Hero'


NPR Online (08/02/11) Louisa Lim

Developing supercomputers that rank among the world's fastest and most powerful has become a priority of the Chinese government, and China is the home of the Tianhe 1-A, the world's second fastest supercomputer. National Supercomputer Center director Liu Guangming expects supercomputers to become critical contributors to Chinese economic and scientific development. "The key is that this supercomputer can fulfill our needs, including mineral exploration, bioengineering, patents, pharmaceuticals, and gene sequencing," he notes. However, critics contend that much of the compatible software can only use the Tianhe 1-A's central processing units and not the machine's graphics processing units. "Most of the people who use the national supercomputer are animation and games people," says Cao Jianwen with the Chinese Academy of Sciences. Liu counters this notion, noting for instance that the Tianhe 1-A has played an important role in China's energy development. Still, the United States is far ahead of China in terms of software development, but China aims to ramp up its supercomputing innovation by building related facilities and all-Chinese chips.

The Science of Cyber Security

National Science Foundation (08/04/11) Marlene Cimons

The University of California, Berkeley's Team for Research in Ubiquitous Secure Technology (TRUST) is developing cybersecurity science and technology designed to change how organizations design, build, and operate trustworthy information systems. One of TRUST's long-term goals is to build a science base that will lead to new cybersecurity defense systems. "We believe what is missing is the science of cybersecurity--a science base, like the kind taught in medical schools, so as to enable doctors to treat and help patients," says Berkeley's Shankar Sastry. "We want the legacy of TRUST to be the start of this science base, upon which an inherent defense system can be built that will operate almost like the body's in the event of an attack." TRUST's research partners include Carnegie Mellon, Cornell, San Jose State, Stanford, and Vanderbilt universities, as well as Intel, Cisco Systems, IBM, Symantec, and Qualcomm. TRUST wants to improve anti-identity theft systems and technology that secures sensitive documents such as medical records. For example, TRUST is working with Vanderbilt's medical school on a pilot project to research privacy issues involved with medical and billing information. TRUST also is developing a cybersecurity education program.

Android Developers

The Lebanese mirror of the Android Developers site.

Friday, August 5, 2011

Researchers Give Robot Ability to Learn

PhysOrg.com (08/02/11) Bob Yirka 

A type of self-replicating neural technology developed by researchers at the Tokyo Institute of Technology can give robots the ability to learn and perform new tasks. The neural technology, called the Self-Organizing Incremental Neural Network, determines what to do next in a given situation by storing information in a network that is designed to mimic the human brain. In a demonstration, the researchers asked a robot to fill a cup with water from a bottle, which it performed using predefined instructions. The researchers then asked the robot to cool the beverage while it was in the middle of carrying out the same instructions as before. The robot, which was holding the cup in one hand and the bottle in the other, paused to consider what it needed to do to carry out the request. It set the bottle down, reached over to retrieve an ice cube, and placed it in the cup. The researchers say the technology enables a robot to learn from its experiences, use the Internet to research how to do things, and potentially learn from similar robots that have already performed the tasks it wants to do.

Women Hitting a Wall in IT

NextGov.com (08/02/11) Brittany Ballenstedt

Most women in information technology (IT) careers remain in junior or mid-level management positions, and many believe their male colleagues are being promoted ahead of them, according to a Women in Technology survey. Sixty-one percent of the respondents have more than 10 years of experience in the tech sector, but only 26 percent have reached senior management or board level positions. "These results indicate that the big strides toward equality that we had hoped for in 2007 have not yet happened and that the gender balance in the workplace still has a long way to go," says the survey's report. The latest findings are similar to the results from 2007, when the vast majority of respondents had not taken and did not intend to take a career break after maternity. The report also notes that there are several benefits available to women in IT. Eighty percent of employers offer remote working options and 75 percent offer flexible working hours. Salary, benefits, career opportunities, and flexible work hours are the top reasons why women apply for jobs with certain organizations.

Intel Invests $30 Million in Cloud, Embedded Research Centers

eWeek (08/03/11) Jeffrey Burt

Intel is investing $30 million in two new research centers at Carnegie Mellon University that will focus on cloud computing and embedded technology. The two Intel Science and Technology Centers (ISTCs) are part of a $100 million research effort Intel launched in January designed to produce technological advances that can be used throughout the industry. The ISTCs use open intellectual property models, and the research results will become publicly available through technical publications and open source software releases. The cloud computing center is part of Intel's Cloud 2015 initiative, which is designed to push innovation to enable businesses and consumers to share data securely across public and private clouds. The researchers will pursue extending the cloud to the network edge and client devices and cloud technologies such as built-in application optimization. "In the future, these capabilities could enable a digital personal handler via a device wired into your glasses that sees what you see, to constantly pull data from the cloud and whisper information to you during the day--telling you who people are, where to buy an item you just saw, or how to adjust your plans when something new comes up," according to Intel.

Research Targets Computational Experiments in the Cloud

HPC in the Cloud (07/27/11) Nicole Hemsoth

A St. Andrews University research team recently launched an investigation into the viability of running computational experiments on virtualized hardware. Researcher Lars Kotthoff says in an interview that reliability and reproducibility of results are critical to such research, especially when it involves deploying complex systems that solve artificial intelligence problems. "Our results show that there is no inherent disadvantage to using virtualized resources for such experiments as long as the results are carefully evaluated and statistical methods used to identify outliers and establish confidence in the conclusions drawn from the results," he notes. Kotthoff says the best case for running central processing unit (CPU)-intensive tasks on cloud resources is likely one that involves many experiments or experiments that can be easily distributed across multiple virtual machines when there are not sufficiently available physical resources. He says that getting reliable CPU timings is a substantial difficulty only for the small types of virtual machines in a cloud. "For larger types, the reliability becomes as good as on physical hardware or even better," Kotthoff says.
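As one concrete way to screen repeated timings for outliers (an illustration only, not necessarily the statistical method the St Andrews team used), the sketch below applies the median absolute deviation to a set of invented measurements:
  // Flag timings that sit more than ~3 scaled MADs away from the median.
  val timings = Vector(10.2, 10.4, 10.3, 10.1, 14.9, 10.2, 10.5) // seconds, invented

  def median(xs: Seq[Double]): Double = {
    val s = xs.sorted
    if (s.size % 2 == 1) s(s.size / 2)
    else (s(s.size / 2 - 1) + s(s.size / 2)) / 2.0
  }

  val m   = median(timings)
  val mad = median(timings.map(t => math.abs(t - m)))

  val outliers = timings.filter(t => math.abs(t - m) > 3 * 1.4826 * mad)
  println(s"median = $m s, MAD = $mad s, outliers = $outliers")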

Thursday, August 4, 2011

The Linux Documentation Project: Guides


Why manage projects with the Scrum framework?


  • Scrum is an agile process to manage and control development work.
  • Scrum is a wrapper for existing engineering practices.
  • Scrum is a team-based approach to developing systems and products iteratively and incrementally when requirements are rapidly changing.
  • Scrum is a process that controls the chaos of conflicting interests and needs.
  • Scrum is a way to improve communications and maximize co-operation.
  • Scrum is a way to detect and cause the removal of anything that gets in the way of developing and delivering products.
  • Scrum is a way to maximize productivity.
  • Scrum is scalable from single projects to entire organizations. Scrum has controlled and organized development and implementation for multiple interrelated products and projects with over a thousand developers and implementers.
  • Scrum is a way for everyone to feel good about their job, their contributions, and that they have done the very best they possibly could.

Wednesday, August 3, 2011

SCRUM

The PDF document covers: About Scrum, Scrum Roles, Scrum Meetings, Scrum Artifacts, Scaling, Related Practices, and more. Download it

Data Centers' Power Use Less Than Was Expected

New York Times (07/31/11) John Markoff

Stanford University researchers have found that the global recession and new power-saving technologies have resulted in a reduction in power usage by data centers. The actual number of computer servers has declined over the last three years, and the emergence of technologies such as more efficient computer chips and computer server virtualization has led to the reduced power needs, says Stanford professor Jonathan G. Koomey. "Mostly because of the recession, but also because of a few changes in the way these facilities are designed and operated, data center electricity consumption is clearly much lower than what was expected, and that's really the big story," Koomey says. Industry experts agree with Koomey's analysis, but many think the slower growth might be temporary. The slower growth rate is significant because it comes at a time when data centers are being built at a record pace, mainly due to companies such as Amazon, Apple, Google, Microsoft, and Facebook operating huge data centers for their millions of users. Most data center designers use standard industry equipment, while a few companies, such as Google, build custom servers for their data centers, making them more efficient than mainstream data centers.

Monday, August 1, 2011

Online Open Source Project Management


Why choose dotProject Project Management software?

dotProject is an open source project management application, supported free of charge by web developers from all over the world. dotProject is written in PHP and is free for anyone who would like to download and use it. It is easy to work with dotProject because it has a clean and simple user interface. Its core features include:
  • User Management
  • Ticketing Support System provided via email
  • Client Management
  • Task Listing
  • File Archive
  • Contact List
  • Calendar
  • Discussion Forum
With eurabiahosting's dotProject hosting services, each customer receives a free installation of dotProject on his/her website. Our support team has experience in managing dotProject and will make sure that your project management site runs trouble-free!