Technology Development – Turning Seawater into Jet Fuel

Converting carbon dioxide and hydrogen into hydrocarbons that can then be used to produce JP-5 fuel stock. The technology could have economically viable, widespread applicability.

This article was first published in New Energy & Fuel on September 25, 2012.

Scientists at the U.S. Naval Research Laboratory (NRL) are developing a process to extract carbon dioxide (CO2) and produce hydrogen gas (H2) from seawater.  Then they catalytically convert the CO2 and H2 into jet fuel by a gas-to-liquids process.

The NRL effort has successfully developed and demonstrated technologies for the recovery of the CO2 and the production of the H2 from seawater using an electrochemical acidification cell, and the conversion of the CO2 and H2 to hydrocarbons that can be used to produce jet fuel.

Electrochemical Acidification Carbon Capture Skid.

NRL research chemist Dr. Heather Willauer said, “The potential payoff is the ability to produce JP-5 jet fuel stock at sea reducing the logistics tail on fuel delivery with no environmental burden and increasing the Navy’s energy security and independence.”  JP-5 is very close chemically to kerosene and diesel.

Willauer continues, “The reduction and hydrogenation of CO2 to form hydrocarbons is accomplished using a catalyst that is similar to those used for Fischer-Tropsch reduction and hydrogenation of carbon monoxide. By modifying the surface composition of iron catalysts in fixed-bed reactors, NRL has successfully improved CO2 conversion efficiencies up to 60%.”

Technically, the NRL has developed a two-step laboratory process to convert the CO2 and H2 gathered from the seawater to liquid hydrocarbons. In the first step, an iron-based catalyst can achieve CO2 conversion levels up to 60% and decrease unwanted methane production from 97% to 25% in favor of longer-chain unsaturated hydrocarbons (olefins). Then in step two the olefins can be oligomerized (a chemical process that converts monomers, molecules of low molecular weight, to a compound of higher molecular weight by a finite degree of polymerization) into a liquid containing hydrocarbon molecules in the carbon C9-C16 range, suitable for conversion to jet fuel by a nickel-supported catalyst reaction.
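
As a rough sketch of the underlying chemistry (simplified stoichiometry for illustration, not NRL's published reaction scheme), step one can be pictured as a reverse water-gas shift followed by Fischer-Tropsch-type chain growth over the iron catalyst, and step two as olefin oligomerization over the nickel-supported catalyst, for example three butenes combining into a single C12 olefin:

\[
\mathrm{CO_2 + H_2 \;\rightleftharpoons\; CO + H_2O}, \qquad
n\,\mathrm{CO} + 2n\,\mathrm{H_2} \;\rightarrow\; \mathrm{C}_n\mathrm{H}_{2n} + n\,\mathrm{H_2O}
\]

\[
3\,\mathrm{C_4H_8} \;\rightarrow\; \mathrm{C_{12}H_{24}}
\]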

The raw materials are abundant.  CO2 is an abundant carbon source in seawater, with the concentration in the ocean about 140 times greater than that in air. Two to three percent of the CO2 in seawater is dissolved CO2 gas in the form of carbonic acid, one percent is carbonate, and the remaining 96 to 97% is bound in bicarbonate. If processes can be developed to take advantage of this higher weight per volume concentration of CO2 in seawater, coupled with more efficient catalysts for the heterogeneous catalysis of CO2 and H2, a viable sea-based synthetic fuel process could emerge.
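
Here is a quick back-of-the-envelope check of that roughly 140-fold figure, using textbook values. Every number below is an assumption chosen for illustration, not a figure from the NRL work.

CO2_MOLAR_MASS = 44.0        # g/mol
AIR_MOLAR_MASS = 29.0        # g/mol, approximate mean for air
AIR_DENSITY = 1.2            # g per litre of air near sea level (assumed)
CO2_PPMV_AIR = 400e-6        # ~400 ppm CO2 by volume in air (assumed)
DIC_SEAWATER = 2.3e-3        # mol/kg total dissolved inorganic carbon (assumed)
SEAWATER_DENSITY = 1.025     # kg per litre (assumed)

# CO2 mass per litre of air: volume fraction -> mass fraction -> milligrams
co2_air_mg_per_L = AIR_DENSITY * CO2_PPMV_AIR * (CO2_MOLAR_MASS / AIR_MOLAR_MASS) * 1000

# CO2 mass per litre of seawater, counting every carbonate species as CO2
co2_sea_mg_per_L = DIC_SEAWATER * CO2_MOLAR_MASS * SEAWATER_DENSITY * 1000

print(f"air:      {co2_air_mg_per_L:.2f} mg CO2 per litre")
print(f"seawater: {co2_sea_mg_per_L:.0f} mg CO2 per litre")
print(f"ratio:    about {co2_sea_mg_per_L / co2_air_mg_per_L:.0f}x")

With these assumptions the ratio comes out near 140, consistent with the figure quoted above.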

The NRL effort made significant advances developing carbon capture technologies in the laboratory. In the summer of 2009 a standard commercially available chlorine dioxide cell and an electro-deionization cell were modified to function as electrochemical acidification cells. Using the novel modified cells both dissolved and bound CO2 were recovered from seawater by re-equilibrating carbonate and bicarbonate to CO2 gas at a seawater pH below 6. In addition to CO2, the cells produced H2 at the cathode as a by-product.

Note that the oceans offer a huge reserve of raw materials for fuel production.

The completed studies of 2009 assessed the effects of the acidification cell configuration, seawater composition, flow rate, and current on seawater pH levels. The data were used to determine the feasibility of this approach for efficiently extracting large quantities of CO2 from seawater. From these feasibility studies NRL successfully scaled up and integrated the carbon capture technology into an independent skid, a "lab on a pallet" so to speak, called a "carbon capture skid," to process larger volumes of seawater and evaluate the overall system design and efficiencies.

The carbon capture skid’s major component is a three-chambered electrochemical acidification cell. The cell uses small quantities of electricity to exchange hydrogen ions produced at the anode with sodium ions in the seawater stream. As a result, the seawater is acidified. At the cathode, water is reduced to H2 gas and sodium hydroxide (NaOH) is formed. This basic solution may be re-combined with the acidified seawater to return the seawater to its original pH with no additional chemicals. Current and continuing research using the carbon capture skid demonstrates the continuous efficient production of H2 and the recovery of up to 92% of the CO2 from seawater.
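
In simplified form (a generic sketch of the electrochemistry, not the exact design of the NRL cell), the hydrogen ions generated at the anode acidify the seawater stream and push bicarbonate back to CO2 gas, while the cathode yields the H2 and the hydroxide that, together with the exchanged sodium, forms the NaOH:

\[
\begin{aligned}
\text{Anode:}\quad & \mathrm{2\,H_2O \;\rightarrow\; O_2 + 4\,H^+ + 4\,e^-} \\
\text{Cathode:}\quad & \mathrm{2\,H_2O + 2\,e^- \;\rightarrow\; H_2 + 2\,OH^-} \\
\text{Acidified seawater:}\quad & \mathrm{HCO_3^- + H^+ \;\rightarrow\; H_2O + CO_2\,(g)}
\end{aligned}
\]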

The carbon capture skid has been tested using seawater from the Gulf of Mexico to simulate conditions that will be encountered in actual open ocean processing.

The NRL group is working now on process optimization and scale-up.  Initial studies predict that jet fuel from seawater would cost in the range of $3 to $6 per gallon to produce.

Willauer points out, “With such a process, the Navy could avoid the uncertainties inherent in procuring fuel from foreign sources and/or maintaining long supply lines.”  During the government’s fiscal year 2011, the U.S. Navy Military Sea Lift Command, the primary supplier of fuel and oil to the U.S. Navy fleet, delivered nearly 600 million gallons of fuel to Navy vessels underway, operating 15 fleet replenishment ships around the globe.

The Navy's fuel supply system operates at sea, while ships are underway, and it is a costly endeavor in terms of logistics, time, and money, with added risks to national security and to the sailors at sea.

It's an insightful use of the environment. Moreover, the technology would draw down some of the excess CO2 in seawater, effectively recycling past fossil fuel emissions.

Entrepreneurs are going to realize the Navy's work could be an industrial boon to fuel production as well as a way to shorten the carbon cycle.  While the Navy estimates a production cost of $3 to $6 per gallon, the private sector would very likely drive that cost down further.

It's not hard to imagine that in a few years much of the oil business might simply be at sea, harvesting CO2 and H2 and making petroleum products by recycling the CO2 released by past fossil fuel use.

Many may complain that the military is wasteful or poor policy, or hold other notions that fly in the face of human nature.  But over the past few decades the all-volunteer U.S. military has made significant contributions, and now it may solve what has been thought to be an intractable problem.


Brain-Machine Interface – Avatars Of The Future, A Reality

Brain-machine interface lets monkeys move two virtual arms with minds: study

Xinhua | 2013-11-7 | Global Times

 

US researchers said Wednesday that monkeys in a lab have learned to control the movement of both arms on an avatar using just their brain activity.

The findings, published in the US journal Science Translational Medicine, advanced efforts to develop bilateral movement in brain-controlled prosthetic devices for severely paralyzed patients, said researchers at Duke University, based in Durham, the state of North Carolina.

To enable the monkeys to control two virtual arms, the researchers recorded nearly 500 neurons from multiple areas in both cerebral hemispheres of the animals’ brains, the largest number of neurons recorded and reported to date.

Millions of people worldwide suffer from sensory and motor deficits caused by spinal cord injuries. Researchers are working to develop tools to help restore their mobility and sense of touch by connecting their brains with assistive devices.

The brain-machine interface approach holds promise for reaching this goal. However, until now brain-machine interfaces could only control a single prosthetic limb.

“Bimanual movements in our daily activities — from typing on a keyboard to opening a can — are critically important,” senior author Miguel Nicolelis, professor of neurobiology at Duke University School of Medicine, said in a statement. “Future brain-machine interfaces aimed at restoring mobility in humans will have to incorporate multiple limbs to greatly benefit severely paralyzed patients.”

Nicolelis and his colleagues studied large-scale cortical recordings to see if they could provide sufficient signals to brain-machine interfaces to accurately control bimanual movements.

The monkeys were trained in a virtual environment within which they viewed realistic avatar arms on a screen and were encouraged to place their virtual hands on specific targets in a bimanual motor task. The monkeys first learned to control the avatar arms using a pair of joysticks, but were able to learn to use just their brain activity to move both avatar arms without moving their own arms.
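
The study used the Nicolelis lab's own decoding algorithms, but as a generic illustration of the underlying idea, mapping the binned firing rates of a large neural population onto limb coordinates, a minimal linear decoder might look like the sketch below. All numbers and names are made up.

import numpy as np

# Toy illustration only: map binned firing rates of a large neural population
# onto the 2-D positions of two arms. Every number here is made up; the Duke
# study used its own, more sophisticated decoders.

rng = np.random.default_rng(0)
n_neurons, n_samples = 500, 2000        # ~500 recorded units, 2,000 time bins

# Hypothetical ground truth: 4 outputs = (x, y) for the left arm + (x, y) for the right
true_weights = rng.normal(size=(n_neurons, 4))
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
positions = rates @ true_weights + rng.normal(scale=5.0, size=(n_samples, 4))

# Fit a ridge-regularized linear decoder on the first half of the data
ridge = 1.0
X_train, y_train = rates[:1000], positions[:1000]
W = np.linalg.solve(X_train.T @ X_train + ridge * np.eye(n_neurons), X_train.T @ y_train)

# Decode the held-out half and report the correlation for each output channel
pred = rates[1000:] @ W
for i, name in enumerate(["left x", "left y", "right x", "right y"]):
    r = np.corrcoef(pred[:, i], positions[1000:, i])[0, 1]
    print(f"{name}: r = {r:.2f}")

Real brain-machine interfaces use more elaborate decoders (Wiener filters, Kalman filters, neural networks), but the core step of mapping population activity to movement parameters is the same.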

As the animals’ performance in controlling both virtual arms improved over time, the researchers observed widespread plasticity in cortical areas of their brains. These results suggested that the monkeys’ brains may incorporate the avatar arms into their internal image of their bodies, a finding recently reported by the same researchers in the journal Proceedings of the National Academy of Sciences.

The researchers also found that cortical regions showed specific patterns of neuronal electrical activity during bimanual movements that differed from the neuronal activity produced for moving each arm separately.

The study suggested that very large neuronal ensembles — not single neurons — define the underlying physiological unit of normal motor functions, the researchers said, adding that small neuronal samples of the cortex may be insufficient to control complex motor behaviors using a brain-machine interface.

“When we looked at the properties of individual neurons, or of whole populations of cortical cells, we noticed that simply summing up the neuronal activity correlated to movements of the right and left arms did not allow us to predict what the same individual neurons or neuronal populations would do when both arms were engaged together in a bimanual task,” Nicolelis said. “This finding points to an emergent brain property — a non-linear summation — for when both hands are engaged at once.”

Mutant Gene Discovery

Mutant gene discovery will help research

Xinhua | 2013-11-7 | Global Times

 

Chinese doctors have discovered and registered a new mutant gene for alpha-thalassemia, the first of its kind worldwide, an advance that enriches the gene database and will assist research into cures for genetic disease.

Li Youqiong and colleagues from the People's Hospital of Guangxi Zhuang Autonomous Region discovered this gene variant, designated α 21.9, after a series of experiments on a carrier of the hereditary disease in 2011.

Thalassemia is a disease where the carrier is missing or has malfunctioning genes responsible for making hemoglobin, the blood protein that helps to carry oxygen around the body.

The hemoglobin molecule has subunits commonly referred to as alpha and beta.

The mutant gene was identified by the end of 2012, added to the GenBank database at the US-based National Center for Biotechnology Information (NCBI), and then disclosed to the public on Oct. 1, 2013, according to Li.

There is no effective cure for alpha-thalassemia, and the discovery of the new mutation will aid prevention of and research into the disease while providing a theoretical basis for future gene therapy.

There are three main genetic sequence databases worldwide: the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory (EMBL) and GenBank at NCBI. These three organizations exchange data on a daily basis.
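
For readers who want to pull records from GenBank themselves, the sketch below uses Biopython's Entrez interface. The article does not give the accession number of the α 21.9 variant, so the record fetched here, NM_000517 (a human alpha-globin HBA2 mRNA RefSeq), is only a stand-in example.

from Bio import Entrez, SeqIO

Entrez.email = "you@example.org"   # NCBI asks for a contact address on every request

# NM_000517 is used purely as an example record; substitute the accession of interest.
handle = Entrez.efetch(db="nucleotide", id="NM_000517", rettype="gb", retmode="text")
record = SeqIO.read(handle, "genbank")
handle.close()

print(record.id, record.description)
print(f"sequence length: {len(record.seq)} bp")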

Gender Imbalance Troubles China

2010-07-08 18:42 | The Economic Observer

With a severe gender imbalance among young Chinese, China is about to face a lot of problems. According to a Blue Paper on Society released by the China Academy of Social Science, because of the serious gender imbalance among Chinese under the age of 19, in ten years tens of thousands of male Chinese of marriageable age will have difficulty finding a wife.

It is not just the marriage market that will be affected. In agricultural areas, unmarried young men over 25 years old are everywhere; in rural kindergartens and primary schools, the number of male students is noticeably higher than that of females. In the Yangtze River Delta, Pearl River Delta and southeast Fujian Province, where the local economies are dominated by the manufacturing and service industries, the severe shortage of women aged between 18 and 25 has left clothing factories with no choice but to hire young men.

China has entered a society where the number of men far exceeds that of women.

“China’s high sex ratio has lasted for over 20 years, its accumulated effects are becoming obvious,” Yuan Xin, a professor with Nankai University’s population and development research institute, said.

The sex ratio at birth, under normal circumstances, should be 103 to 107 male infants for every 100 female babies. Because the death rate of baby boys is higher than that of girls, the numbers of boys and girls will be close to equal by the time they reach the age of marriage.

But in China, the sex ratio has been increasing since the 1980s. In 1982, when China conducted its third national population census, the number of male births for every 100 female births was 108.47; in 1990, it rose to 111; in 2000, it was 119; and in 2005, it jumped to 120.49, some 13 points above the upper end of the normal range.
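
To make those ratios concrete, here is a small sketch of the arithmetic they imply, converting a sex ratio at birth into the share of newborns that are boys. The ratios are the ones quoted above; the conversion itself is an added illustration.

# Quick arithmetic: what a sex ratio at birth (males per 100 females) implies.
def male_share(srb: float) -> float:
    """Fraction of newborns that are boys for a given sex ratio at birth."""
    return srb / (srb + 100.0)

for year, srb in [(1982, 108.47), (1990, 111), (2000, 119), (2005, 120.49)]:
    share = male_share(srb)
    surplus_per_1000 = 1000 * (2 * share - 1)   # excess boys per 1,000 births
    print(f"{year}: {share:.1%} of births are boys, "
          f"about {surplus_per_1000:.0f} surplus boys per 1,000 births")

At a ratio of 120.49, roughly 55 percent of newborns are boys, or about 93 surplus boys for every 1,000 births.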

“In a short period of over 20 years, the gender imbalance has expanded quickly from eastern provinces to western, from rural areas to urban cities. Now it has almost covered the whole country,” Yuan Xin said. In 1982, only 18 provinces had a relatively high sex ratio while in 2005, all provinces, except Tibet, had a high sex ratio and three provinces had a ratio exceeding 130.

The gender imbalance will not only produce a large number of single young men, but also will give rise to a series of social problems.

According to statistics from the National Bureau of Statistics, the male population aged zero to 19 is 23 million larger than the female population, so over the next ten years roughly 1.2 million more men than women will reach marriageable age each year, forcing them to seek wives in less-developed regions or among younger women. The final result will be that young men in poor areas will be edged out of the marriage market, which, according to Tian Xueyuan, deputy director of the China Population Association, will give rise to a black market of “wife selling” and thus threaten social stability.

In recent years, 36,000 women have been sold and sent to Zhejiang Province to marry local men, statistics from the local public security bureau show. Most of these women are from underdeveloped regions like Yunnan, Guizhou, Sichuan and Hubei.

In the mountainous area connecting Guangxi Province and Vietnam where the economy is poor, men are forced to marry brides who have illegally entered China from Vietnam.

“The narrowing of the marriage market has produced a large number of single men. What is worse, it is the impoverished who are bearing the consequences,” Tian Xueyuan said.

The gender imbalance will also deal a heavy blow to the job market. A textile factory owner, Yuan Xin, who does business in Guangzhou, Hangzhou and Shanghai, said the sex ratio in many textile factories has reached four to six males for every female; some factories have even closed due to a lack of female laborers. Yuan Xin said that the excess of male laborers would intensify competition in the job market and make it even more difficult for women to find jobs. Additionally, because of the shortage of females, in some sectors men would have to take positions which formerly belonged to women, while in other sectors men would face more severe competition.

What has caused such an unbalanced sex ratio? The answer is multi-faceted.

One answer is the advanced technology that allows people to learn the sex of a fetus when a woman is only four months pregnant, or even earlier. Male fetuses will be kept alive while female fetuses will be aborted.

The technology, B-type ultrasound, though prohibited by Chinese law from being used to determine the sex of a fetus, is still available in some clinics in Chinese cities, towns and villages, especially in some villages surrounded by cities.

Those clinics, always disguised as lawful outpatient clinics or pharmacies, will determine the sex of the fetus with a B-type ultrasound scan, and if it is female, they will ask a doctor, who works for a local hospital and wants to earn extra money, to perform an abortion.

But that is not the complete answer.

“The core of the problem lies in the traditional view which holds that men are more important than women,” Tian Xueyuan said.

Though the Chinese government has made it clear that women are equal to men under law, many Chinese parents and families still consider men more important than women and boys better than girls because men are more capable of supporting families and will continue the family line.

According to Yang Juhua, a professor at Renmin University, the unequal social status of males and females is still obvious in Chinese society. Aside from education levels, women still suffer disadvantages in many fields. Their wages are still lower than those of men in same-level positions, and they are more likely to be refused when competing for university places or job vacancies against male peers with the same qualifications. Additionally, Chinese women play a much weaker role in state affairs than their foreign counterparts: females account for only one fifth of the officials in government, party organizations and public agencies.

Edited by Rose Scobie | Original Source: People's Daily

 

3-D Tools & Avionics Manufacturing

By Graham Warwick
Source: Aviation Week & Space Technology

Virtual reality has become a commonplace engineering tool for major aerospace manufacturers, where three-dimensional visualization systems are routinely used to aid design reviews.

But further down the supply chain, simulation environments into which designers can immerse themselves to navigate a structure or walk a cabin are too expensive—and unnecessary if what the company produces fits on a desktop, or in the hand of an engineer.

Avionics manufacturer Rockwell Collins decided to develop its own low-cost 3-D visualization system, initially to perform virtually what previously was done physically: to visually inspect new hardware designs to assess their manufacturability.

The company’s goal in developing the Virtual Product Model (VPM) was to find manufacturing problems earlier in the design cycle, when new avionics boxes are still on the computer screen and before expensive prototypes have been produced.

“3-D virtual reality has been used at the prime level for over a decade, and we recognize its power for communicating and understanding designs and the impact of designs,” says Jim Lorenz, manager of advanced industrial engineering. “Large-scale fully immersive systems are appropriate at the platform level, but at the box level, on a tabletop, their expense is outside what we could deal with.”

Rockwell Collins’s solution was to find commercial software that could be tailored to provide a low-cost way to take product data from its computer-aided design (CAD) system, convert it to 3-D and put it into a virtual environment “without specialist skills or vast expense,” says Kevin Fischer, manager of manufacturing technology pursuits.

Using 3-D glasses and a motion-capture system, an engineer can manipulate the virtual model of an avionics box, inspecting it from all angles to make sure it can be manufactured in the factory or repaired in the field. Several people can view the 3-D model collaboratively during a design review, or it can be sent to individual engineers and viewed in 2-D format on desktop workstations.

“We take the CAD model into the VPM and put it in a format that does not need the software to run. We send an executable file, the engineers open it, inspect the model and determine what its manufacturability is by looking at it,” Fischer says.
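
As a loose analogy only (this is not Rockwell Collins's VPM or its file format), converting CAD geometry into a lightweight 3-D representation that anyone can open and rescale might look something like the following sketch, assuming the CAD tool has exported a tessellated file such as avionics_box.stl and the trimesh package is installed.

import trimesh

# Load the tessellated export of the CAD model (file name is hypothetical);
# force="mesh" collapses multi-body files into a single mesh for simplicity
mesh = trimesh.load("avionics_box.stl", force="mesh")

# Scale the part up for close inspection, loosely analogous to the
# "augmented reality" zoom used to look between circuit cards
mesh.apply_scale(5.0)

print("bounding box (units as exported):", mesh.bounds.tolist())
print("watertight solid:", mesh.is_watertight)

# Open an interactive viewer window (requires a display and the pyglet package)
mesh.show()

The real VPM adds stereo 3-D glasses, motion capture and collaborative review on top of such a model, but the basic pipeline of exporting, converting and inspecting is the same idea.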

The basic requirement is to perform virtually—via 3-D models–the manufacturability assessments previously conducted manually using physical prototypes. And “there are some unique things the system can do,” he says. These include an “augmented reality” mode that allows the user to change the 3-D model’s scale “and go between the circuit cards to see things we can’t catch physically.”

In augmented reality, the user’s hand as represented in the virtual environment, its motion captured by cameras, can be varied in size from that of a large man to that of a small woman to help uncover potential accessibility problems.

The VPM system is now in day-to-day use with new designs. A “couple of hundred” designs have gone through the process and Rockwell Collins puts the return on its investment at 800% in terms of the number of hours required to fix manufacturability issues discovered virtually in the 3-D model versus physically in a hardware prototype.

Although the CAD data is reduced in resolution when it is converted to a 3-D model for visualization, “we have yet to run into a [manufacturability] problem [in the model] and there not turn out to be a correspondingly real problem [in the hardware],” says Lorenz.

Expanding the capability is next on the agenda. One direction is to take the now-manual assessment process and automate it by bringing in rules-based analysis software. “We are starting to think about how to take the capability to visually inspect a design and apply appropriate rules to get a level of automation where we find things we don’t catch by manual inspection,” says Fischer.

Another direction is to pull more data into the visualization environment for use during design reviews, “information such as cost at the piece-part level, so we can see the implications of design decisions,” says Lorenz. “We are also doing some work at the conceptual design level. We would like to use VPM two or three times during the design cycle, but we are not there yet.”

The company also is looking at using VPM as a basis for developing 3-D work instructions for use on the factory floor, and for the technical documents used by field service representatives to troubleshoot problems. “Their key interest is getting down to the circuit-card level, while [in manufacturing] we work with boxes,” says Fischer.

Rockwell Collins also would like to expand the VPM beyond mechanical CAD data. “We want to do electrical, et cetera, in the same environment by pulling together various types of models,” says Fischer. “Anything you can do in PowerPoint, this can do better. But we need to beef up the electrical CAD side of the equation.”

Next Generation Jammer

By Graham Warwick  graham.warwick@aviationweek.com
Source: AWIN First
July 08, 2013

Raytheon has been selected to develop the Next Generation Jammer (NGJ) pod to replace the ALQ-99 tactical jamming system now carried by U.S. Navy Boeing EA-18G Growler electronic-attack aircraft.

The company has been awarded a $279.4 million contract for the 22-month technology development phase of the program. NGJ is planned to become operational in 2020, providing increased jamming agility and precision and expanded broadband capability for greater threat coverage.

Raytheon was one of four contractors involved in the 33-month technology maturation phase of the NGJ program. The others were BAE Systems, ITT Exelis and Northrop Grumman, but the Defense Department contract announcement says only three bids were received.

Under the TD phase, Raytheon will “design and build critical technologies that will be the foundational blocks of NGJ,” says Naval Air Systems Command. The complete system will be flight tested on the EA-18G in the follow-on, 54-month engineering and manufacturing development phase.

Raytheon confirms receipt of the award and says it offered “an innovative, next-generation solution that meets current customer requirements and potential future needs.” All the competitors based their designs for the NGJ pod on active, electronically scanned array jammer antennas.

Replacing the Organ Donor

Lab-grown human cells used to recreate liver functions, hope to replace the organ donor

By Jacob Kastrenakes | July 3, 2013, 04:02 pm | @jake_k

Using small pieces of human liver that were grown from stem cells, a team led by researchers at Japan’s Yokohama City University was able to significantly restore liver function in mice through only a simple transplant — and they hope to eventually use the same method to save human lives. The team took tiny, lab-grown “liver buds” and inserted them into mice, where within two days the cells hooked into surrounding blood vessels and began performing natural functions of the liver. Though the team has yet to track the long-term health of the mice following the procedure, Nature reports that the animals remained alive and well despite prior liver issues.

Having only been demonstrated on mice, the method is still being considered a proof of concept. But the hope is that its immediately promising results can soon be applied to regenerative medicine. The short supply of liver donors has made growing replacements a high priority for interested researchers, but the Yokohama team’s work — which was published today in Nature — remains a preliminary step toward that goal: one of the team’s leaders told Nature that testing the process in humans is still years away. Among the biggest hurdles is simply the difficulty of growing enough cells to actually test them in human patients.

Flash Drive

June 17, 2013

Object of Interest: The Flash Drive


When Daniel Ellsberg decided to copy the Pentagon Papers, in 1969, he secretly reproduced them, page by page, with a photocopier. The process of duplication was slow; every complete copy of the material spanned seven thousand pages. When Edward Snowden decided to leak details of surveillance programs conducted by the National Security Agency, he was able to simply slip hundreds of documents into his pocket; the government believes that Snowden secreted them away on a small device no bigger than a pinkie finger: a flash drive.

The flash drive’s compact size, ever-increasing storage capacity, and ability to interface with any computer that has a universal-serial-bus port—which is, essentially, every computer—makes it an ideal device for covertly copying data or uploading malicious software onto computer systems. They are, consequently, an ongoing security concern. The devices are reportedly banned from the N.S.A.’s facilities; a former N.S.A. official told the Los Angeles Times that “special permission” is required to use them. Even then, the official said, “people always look at you funny.” In the magazine, Seymour Hersh reported that an incident involving a USB drive resulted in some N.S.A. unit commanders ordering “all ports on the computers on their bases to be sealed with liquid cement.”

USB flash drives are perhaps the purest form of two distinct pieces of technology: flash memory and the universal serial bus. Flash memory was invented at Toshiba in the nineteen-eighties. According to Toshiba’s timeline, the NAND variant of flash memory, which is the kind now used for storage in myriad devices, like smartphones and flash drives, was invented in 1987. The technology, which stores data in memory cells, remained incredibly expensive for well over a decade, costing hundreds of dollars per megabyte in the early to mid-nineteen-nineties. The universal serial bus was developed in the mid-nineties by a coalition of technology companies to simplify connecting devices to computers through a single, standardized port. By the end of the decade, flash memory had become inexpensive enough to begin to make its way into consumer devices, while USB succeeded in becoming a truly universal computer interface.

The first patent for a “USB-based PC flash disk” was filed in April, 1999, by the Israeli company M-Systems (which no longer exists—it was acquired by SanDisk in 2006). Later that same year, I.B.M. filed an invention disclosure by one of its employees, Shimon Shmueli, who continues to claim that he invented the USB flash drive. Trek 2000 International, a Singaporean company, was the first to actually sell a USB flash drive, which it called the ThumbDrive, in early 2000. (It won the trademark for ThumbDrive, which has come to be a generic term for the devices, only a few years ago.) Later that year, I.B.M. was the first to sell the devices in the U.S. The drive, produced by M-Systems, was called the DiskOnKey. The first model held just eight megabytes. The timing was nonetheless fortuitous: 1.44-megabyte floppy disks had long been unable to cope with expanding file sizes, and even the most popular souped-up replacement, the Zip drive, failed to truly succeed it. Optical media, despite storing large amounts of data, remained relatively inconvenient; recording data was time consuming, re-recording it even more so.

Improved manufacturing technologies have simultaneously increased flash drives’ capacity while decreasing their cost. The most popular flash drive on Amazon stores thirty-two gigabytes and costs just twenty-five dollars, while a flash drive recently announced by Kingston can hold one terabyte of data—enough for thousands of hours of audio, or well over a hundred million pages of documents—and transfer that data at speeds of a hundred and sixty to two hundred and forty megabytes per second. Few things come to mind that store more information in less space—a black hole, for instance.
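
The arithmetic behind those claims is easy to check. The sketch below uses assumed figures for audio bit rate and page size, so treat it as an order-of-magnitude estimate only.

TB = 1_000_000_000_000          # one terabyte, in bytes (decimal convention)
MB = 1_000_000

AUDIO_BYTES_PER_HOUR = 128_000 / 8 * 3600    # 128 kbps MP3 (assumed), about 57.6 MB/hour
PAGE_BYTES = 5_000                            # ~5 KB of plain text per page (assumed)
TRANSFER_RATE = 200 * MB                      # midpoint of the quoted 160-240 MB/s

print(f"hours of audio:   {TB / AUDIO_BYTES_PER_HOUR:,.0f}")
print(f"pages of text:    {TB / PAGE_BYTES:,.0f}")
print(f"time to fill:     {TB / TRANSFER_RATE / 60:,.0f} minutes at 200 MB/s")

Those assumptions land comfortably within the "thousands of hours" and "well over a hundred million pages" quoted above, and they show that even at top speed, filling a terabyte drive takes well over an hour.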

More critically, as convenience drives people to share more and more information across networks, rather than through meatspace—why back up data on a spare hard drive when you can store it in the cloud for cents on the gigabyte, or burn a movie to a disc for a friend when you can share it via Dropbox?—flash drives are a convenient means of transporting large quantities of information off the grid. (Getting that data onto the flash drive in the first place may be another matter, though.) Carrying a flash drive in your pocket on the subway does not produce network traffic or metadata that can later be analyzed.

Flash drives have even been used to create a new form of a dead drop in cities around the country: the drives are embedded into walls or other public spaces, and users simply plug their device into the exposed USB port to download or upload data. Though these dead drops are largely a kind of performance art, the intent is to allow people to anonymously share data without passing it over a network—a proposition that is only growing more rarefied.

It seems certain that there will be more Daniel Ellsbergs and Edward Snowdens, and almost as certain that flash drives will be a tool they use to secretly copy and abscond with the information they need—at least until something that is even more discreet, secure, and convenient arrives.

Nitric Oxide – Build Muscle

The Surprising New Tricks Pros Are Using to Build Muscle

Reading about sports these days, we are constantly bombarded with news of top-notch athletes being exposed for using illegal steroids.

Steroid use involves huge costs, legal issues, and above all, potential health problems. With such risks, you wonder why anyone would be tempted to go this route.

Fortunately, steroid use may eventually be a thing of the past.  That’s because medical researchers studying how the human body builds muscle and endurance are developing safe and legal substances which can increase the body’s ability to build muscle, without the health risks associated with steroids.

One of the most interesting fields of research surrounds a naturally occurring chemical compound called nitric oxide. Nitric oxide is a vasodilator, which means it helps move oxygen to the muscles when they need it most. Increased nitric oxide in the blood stream signals the blood vessel walls to relax, which allows more blood to flow to the body’s muscles, thus delivering more oxygen and nutrients throughout the body.

It’s been shown to lead to:

  • Drastic Muscle Gains
  • Increased Blood Flow and Oxygen Delivery
  • Boosted Strength, Endurance, and Power
  • Support for Your Immune System
  • Immediate Results
  • Total Body Transformation

While the body naturally increases nitric oxide during workouts, it does so only in limited amounts, so researchers have focused on artificially increasing nitric oxide levels.

One of the most successful products to emerge from this research is called Factor 2.  It uses "arginines," special amino acids linked to nitric oxide production, to significantly increase oxygen and nutrient flow to the muscles during workouts.  As a result, it can safely spark powerful muscle growth, muscle definition, and strength gains.

Factor 2 produces noticeable results by maximizing your muscle gains as you power through your workouts; within a few weeks, users start to notice additional muscle definition and strength.

Factor 2 is now the recognized leader in nitric oxide stimulation and legal, safe muscle and strength enhancement.  It was Bodybuilding.com’s Best New Brand of the Year (2011) and pro athletes are taking note.

Athletes like professional football player Vernon Davis have discovered the dramatic benefits of using a nitric oxide supplement. Davis has been an advocate of Factor 2 since first taking it, telling his teammates in San Francisco, “Factor 2 has proven results. I believe in results.”

Zero

Who Invented Zero?

Jessie Szalay, LiveScience Contributor
Date: 12 March 2013 Time: 06:22 PM ET

“The concept of zero, both as a placeholder and as a symbol for nothing, is a relatively recent development.”

Though humans have always understood the concept of nothing or having nothing, the concept of zero is relatively new — it only fully developed in the fifth century A.D. Before then, mathematicians struggled to perform the simplest arithmetic calculations. Today, zero — both as a symbol (or numeral) and a concept meaning the absence of any quantity — allows us to perform calculus, do complicated equations, and to have invented computers.

Early history: Angled wedges

Zero was invented independently by the Babylonians, Mayans and Indians (although some researchers say the Indian number system was influenced by the Babylonians). The Babylonians got their number system from the Sumerians, the first people in the world to develop a counting system. Developed 4,000 to 5,000 years ago, the Sumerian system was positional — the value of a symbol depended on its position relative to other symbols. Robert Kaplan, author of “The Nothing That Is: A Natural History of Zero,” suggests that an ancestor to the placeholder zero may have been a pair of angled wedges used to represent an empty number column. However, Charles Seife, author of “Zero: The Biography of a Dangerous Idea,” disagrees that the wedges represented a placeholder.

The Sumerians’ system passed through the Akkadian Empire to the Babylonians around 300 B.C. There, scholars agree, a symbol appeared that was clearly a placeholder — a way to tell 10 from 100 or to signify that in the number 2,025, there is no number in the hundreds column. Initially, the Babylonians left an empty space in their cuneiform number system, but when that became confusing, they added a symbol — double angled wedges — to represent the empty column. However, they never developed the idea of zero as a number.
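
To see why a placeholder matters, write the article's own example, 2,025, in positional notation (the decimal expansion here is an added illustration, not from the source):

\[
2{,}025 \;=\; 2\times10^{3} \;+\; 0\times10^{2} \;+\; 2\times10^{1} \;+\; 5\times10^{0}.
\]

Without a symbol marking the empty hundreds column, 2,025 and 225 would be written the same way; the Babylonian double wedge resolved exactly this ambiguity, in their base-60 system rather than base 10.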

Zero in the Americas

Six hundred years later and 12,000 miles from Babylon, the Mayans developed zero as a placeholder around A.D. 350 and used it in their elaborate calendar systems. Despite being highly skilled mathematicians, however, the Mayans never used zero in equations. Kaplan describes the Mayan invention of zero as the “most striking example of the zero being devised wholly from scratch.”

India: Where zero became a number

Some scholars assert that the Babylonian concept wove its way down to India, but others give the Indians credit for developing zero independently.

The concept of zero first appeared in India around A.D. 458. Mathematical equations were spelled out or spoken in poetry or chants rather than symbols. Different words symbolized zero, or nothing, such as “void,” “sky” or “space.” In 628, a Hindu astronomer and mathematician named Brahmagupta developed a symbol for zero — a dot underneath numbers. He also developed mathematical operations using zero, wrote rules for reaching zero through addition and subtraction, and described the results of using zero in equations. This was the first time in the world that zero was recognized as a number of its own, as both an idea and a symbol.

From the Middle East to Wall Street

Over the next few centuries, the concept of zero caught on in China and the Middle East. According to Nils-Bertil Wallin of YaleGlobal, by A.D. 773, zero reached Baghdad where it became part of the Arabic number system, which is based upon the Indian system.

A Persian mathematician, Mohammed ibn-Musa al-Khowarizmi, suggested that a little circle should be used in calculations if no number appeared in the tens place. The Arabs called this circle “sifr,” or “empty.” Zero was crucial to al-Khowarizmi, who used it to invent algebra in the ninth century. Al-Khowarizmi also developed quick methods for multiplying and dividing numbers, which are known as algorithms — a corruption of his name.

Zero found its way to Europe through the Moorish conquest of Spain and was further developed by Italian mathematician Fibonacci, who used it to do equations without an abacus, then the most prevalent tool for doing arithmetic. This development was highly popular among merchants, who used Fibonacci’s equations involving zero to balance their books.

Wallin points out that the Italian government was suspicious of Arabic numbers and outlawed the use of zero. Merchants continued to use it illegally and secretively, and the Arabic word for zero, “sifr,” brought about the word “cipher,” which not only means a numeric character, but also came to mean “code.”

By the 1600s, zero was used fairly widely throughout Europe. It was fundamental in Rene Descartes’ Cartesian coordinate system and in Sir Isaac Newton’s and Gottfried Wilhelm Leibniz’s developments of calculus. Calculus paved the way for physics, engineering, computers, and much of financial and economic theory.