Gender Imbalance Troubles China

2010-07-08 18:42 | The Economic Observer
With a severe gender imbalance among young Chinese, China is about to face a host of problems. According to a Blue Paper on Society released by the Chinese Academy of Social Sciences, because of the serious gender imbalance among Chinese under the age of 19, in ten years tens of thousands of Chinese men of marriageable age will have difficulty finding a wife. And it is not just the marriage market that will be affected. In agricultural areas, unmarried men over 25 are everywhere; in rural kindergartens and primary schools, male students clearly outnumber females. In the Yangtze River Delta, the Pearl River Delta and southeast Fujian Province, where local economies are dominated by manufacturing and services, the severe shortage of women aged 18 to 25 has left clothing factories no choice but to hire young men.

China has entered a society where the number of men far exceeds that of women.

“China’s high sex ratio has lasted for over 20 years, and its accumulated effects are becoming obvious,” said Yuan Xin, a professor at Nankai University’s population and development research institute.

Under normal circumstances, the sex ratio at birth is 103 to 107 male infants for every 100 female babies. Because infant mortality is higher among boys than girls, the numbers of men and women are close to equal by the time they reach marriageable age.

But in China, the sex ratio has been rising since the 1980s. In 1982, when China conducted its third national population census, the number of male births for every 100 female births was 108.47; in 1990, it rose to 111; in 2000, it was 119; and in 2005, it jumped to 120.49, roughly 13 points above the upper end of the normal range.

“In a short period of over 20 years, the gender imbalance has expanded quickly from eastern provinces to western, from rural areas to urban cities. Now it has almost covered the whole country,” Yuan Xin said. In 1982, only 18 provinces had a relatively high sex ratio while in 2005, all provinces, except Tibet, had a high sex ratio and three provinces had a ratio exceeding 130.

The gender imbalance will not only produce a large number of single young men, but also will give rise to a series of social problems.

According to statistics from the National Bureau of Statistics, the male population aged zero to 19 is 23 million larger than the female population of the same ages. Over the next ten years, roughly 1.2 million more men will reach marriageable age each year than women, forcing them to seek wives in less-developed regions or among younger women. The end result will be that young men in poor areas are edged out of the marriage market, which, according to Tian Xueyuan, deputy director of the China Population Association, will give rise to a black market of “wife selling” and threaten social stability.
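The scale of the projected surplus follows from simple arithmetic on the article's own figures. A minimal sketch, where the even spread of the gap across one-year age cohorts is my assumption, not an official projection:

```python
# Back-of-envelope check of the article's demographic figures.

surplus_males_0_to_19 = 23_000_000  # males minus females, ages 0 to 19 (NBS figure)
cohort_years = 19                   # the age band spans roughly 19 one-year cohorts

# If the surplus is spread evenly across cohorts, each year roughly this many
# more men than women reach marriageable age:
surplus_per_year = surplus_males_0_to_19 / cohort_years
print(f"{surplus_per_year:,.0f} surplus men per year")  # ~1.2 million, matching the article
```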

In recent years, 36,000 women have been sold and sent to Zhejiang Province to marry local men, statistics from the local public security bureau show. Most of these women are from underdeveloped regions like Yunnan, Guizhou, Sichuan and Hubei.

In the impoverished mountainous area along the border between Guangxi and Vietnam, men are driven to marry brides who have entered China illegally from Vietnam.

“The narrowing of the marriage market has produced a large number of single men. What is worse, it is the impoverished who are bearing the consequences,” Tian Xueyuan said.

The gender imbalance will also deal a heavy blow to the job market. A textile factory owner doing business in Guangzhou, Hangzhou and Shanghai said the sex ratio in many textile factories has reached four to six men for every woman; some factories have even closed for lack of female workers. Yuan Xin said the excess of male laborers will intensify competition in the job market and make it even more difficult for women to find jobs. Additionally, because of the shortage of women, in some sectors men will have to take positions that formerly belonged to women, while in others men will face more severe competition.

What has caused such an unbalanced sex ratio? The answer is multi-faceted.

One answer is technology that allows people to learn the sex of a fetus when a woman is only four months pregnant, or even less. Male fetuses are kept alive while female fetuses are aborted.

The technology, B-type ultrasound, though prohibited by Chinese law from being used for fetal sex determination, is still available in some clinics in Chinese cities, towns and villages, especially in villages on the outskirts of cities.

These clinics, usually disguised as lawful outpatient clinics or pharmacies, determine the sex of the fetus with a B-type ultrasound scan and, if it is female, ask a doctor who works for a local hospital and wants to earn extra money to perform an abortion.

But that is not the complete answer.

“The core of the problem lies in the traditional view which holds that men are more important than women,” Tian Xueyuan said.

Though the Chinese government has made it clear that women are equal to men under the law, many Chinese parents and families still consider men more important than women, and boys better than girls, because men are seen as more capable of supporting a family and carrying on the family line.

According to Yang Juhua, a professor at Renmin University, the unequal social status of men and women is still obvious in Chinese society. Education levels aside, women remain at a disadvantage in many fields. Their wages are still lower than men’s in same-level positions, and they are more likely to be turned away when competing with equally qualified male peers for university places or job vacancies. Additionally, Chinese women play a much weaker role in state affairs than their foreign counterparts: women account for only one fifth of officials in government, party organizations and public agencies.

Edited by Rose Scobie | Original Source: People’s Daily


7 Technology Trends for 2014

The Top 7 Technology Trends That Will Dominate 2014

Jayson DeMers

Strap yourself in, it’s going to be a wild ride. In considering the changes we’ve seen in technology over the past year, I’m bracing myself for unprecedented growth when it comes to anytime, anywhere, on-demand information and entertainment.

Based on the trends we’ve seen so far in 2013, I predict 2014 will see many fledgling technologies mature and grow beyond what we could have imagined just a few years ago.

So without further ado, here are my top 7 predictions for technology trends that will dominate 2014.

1. Consumers will come to expect Smart TV capabilities

With Smart TV shipments expected to reach 123 million in 2014 – up from about 84 million in 2012 – we are poised to see explosive growth in this industry.

In the midst of this growth, we will continue to see fierce competition between major players like Samsung, Panasonic, and LG. Prices will need to continue to drop, as more consumers crave, and even expect, the ability to use Netflix, Hulu, Amazon Instant Video and their web browser via their TV.

Of course, the development we’re all waiting for in 2014 is the release of Apple’s much anticipated iTV. It appears the iTV is now in the early development stage, and that Apple may be in the process of making a deal with Time Warner to facilitate programming on Apple devices.

The device is rumoured to include iCloud sync, the ability to control your iPhone, and ultra HD LCD panels. Keep an eye out for this release as early as summer 2014.

2. Smart watches will become ‘smarter’

Rather than having to pull out your smartphone or tablet for frequent email, text and social media updates, you’ll glance at your watch.

2014 is the year to keep an eye out for the Google watch. Rumor has it the device will integrate with Google Now, which aims to seamlessly provide relevant information when and where you want it (and before you’d asked for it).

We’ll see smart watches become even smarter, learning what news and updates are important to us, when we want to receive them, and responding more accurately to voice controls.

For smart watches to succeed, they’ll need to offer us something that our smart phone can’t; whether this means more intuitive notifications, or the ability to learn from our daily activities and behaviours (for instance, heart rate monitoring), it will be interesting to see.

3. Google Glass will still be in “wait and see” mode

While Google Glass hasn’t yet been released to the general public, we’ve heard enough about it to know it’s still very early days for this technology. With an estimated 60,000 units expected to sell in 2013, and a predicted several million in 2014, it’s still a long way from becoming a common household technology.

These augmented reality glasses allow you to access information like email and texts, take hands-free pictures and videos, effortlessly translate your voice, and even receive overlaid walking, cycling or driving directions, right within your field of vision.

It’s predicted that both Google Glass 2.0, and its companion, the Glass App Store, should be released to the general public sometime in 2014.

Be on the lookout for competition in this market, particularly from major players like Samsung. I predict we’ll see much of this competition aimed at niche markets like sports and healthcare.

4. Other applications and uses for Apple’s TouchID will emerge

The release of the iPhone 5S has, for the first time, made on-the-go fingerprint security mainstream. The potential for Touch ID technology to really take off is, I believe, inevitable. Touch ID, which uses a capacitive sensor to scan your fingerprint, brings convenient, strong security to your iPhone.

Currently, the technology is limited; the only real uses are unlocking your iPhone and making purchases in the App Store. I predict we’ll see this technology incorporated into other Apple products soon; I think we’ll even see Touch ID integrated into MacBook products later this year or next.

I also predict Touch ID, though not quite bug-free, will be used for other purposes, such as securely integrating with home security systems, accessing password software, and even paying for groceries (more on that in an upcoming article).

5. Xbox One and PS4 will blur the lines between entertainment and video gaming

The new gaming consoles (Xbox One and PS4) will increasingly integrate social media-like connectivity between players. Players could have followers and work together to achieve in-game goals, and new matchmaking technology will allow equally skilled players to compete.

The PS4, slated to be released November 15th, will track both the controller and the player’s face and movements for more intuitive play.

Apart from great gaming, these systems will allow for a far more integrative entertainment experience. For instance, rather than switching between TV, gaming, music and sports, you’ll be able to do two or even three activities side-by-side, or by easily switching back and forth.

6. 3D Printing will begin to revolutionize production

We’ve seen a huge rise in the popularity of 3D printing this year, coupled with a dramatic fall in pricing. The ability to easily create multi-layered products that are actually usable – well, that’s pretty amazing.

I’ll be watching for a movement towards simple products being produced close to home, and to greater customization given the ease of manufacturing. I think it’s inevitable that manufacturing in countries such as China will become less appealing and lucrative for businesses given the high costs of shipping and managing overseas contracts.

I don’t expect these changes to reach their full effect in 2014, however I believe businesses will be starting to consider how this will affect their production plans for 2015 and beyond.

7. The movement towards natural language search will make search more accurate and intuitive

There was a time when we used terms like “personal digital assistant” to describe a hand-held calendar. Oh, how times have changed.

With the emergence of intelligent personal assistants like Google Now and Apple’s Siri, the goal is to have information intuitively delivered to you, often before you even ask for it. The shift seems to be away from having to actively request data, and instead to have it passively delivered to your device.

Natural language search will continue to overtake keyword-based search, as seen by Google’s move towards longer, more natural searches in its recent release of Hummingbird, Google’s largest algorithm update thus far.

 

3-D Tools & Avionics Manufacturing

By Graham Warwick
Source: Aviation Week & Space Technology

Virtual reality has become a commonplace engineering tool for major aerospace manufacturers, where three-dimensional visualization systems are routinely used to aid design reviews.

But further down the supply chain, simulation environments into which designers can immerse themselves to navigate a structure or walk a cabin are too expensive—and unnecessary if what the company produces fits on a desktop, or in the hand of an engineer.

Avionics manufacturer Rockwell Collins decided to develop its own low-cost 3-D visualization system, initially to perform virtually what previously was done physically: to visually inspect new hardware designs to assess their manufacturability.

The company’s goal in developing the Virtual Product Model (VPM) was to find manufacturing problems earlier in the design cycle, when new avionics boxes are still on the computer screen and before expensive prototypes have been produced.

“3-D virtual reality has been used at the prime level for over a decade, and we recognize its power for communicating and understanding designs and the impact of designs,” says Jim Lorenz, manager of advanced industrial engineering. “Large-scale fully immersive systems are appropriate at the platform level, but at the box level, on a tabletop, their expense is outside what we could deal with.”

Rockwell Collins’s solution was to find commercial software that could be tailored to provide a low-cost way to take product data from its computer-aided design (CAD) system, convert it to 3-D and put it into a virtual environment “without specialist skills or vast expense,” says Kevin Fischer, manager of manufacturing technology pursuits.

Using 3-D glasses and a motion-capture system, an engineer can manipulate the virtual model of an avionics box, inspecting it from all angles to make sure it can be manufactured in the factory or repaired in the field. Several people can view the 3-D model collaboratively during a design review, or it can be sent to individual engineers and viewed in 2-D format on desktop workstations.

“We take the CAD model into the VPM and put it in a format that does not need the software to run. We send an executable file, the engineers open it, inspect the model and determine what its manufacturability is by looking at it,” Fischer says.

The basic requirement is to perform virtually—via 3-D models—the manufacturability assessments previously conducted manually using physical prototypes. And “there are some unique things the system can do,” he says. These include an “augmented reality” mode that allows the user to change the 3-D model’s scale “and go between the circuit cards to see things we can’t catch physically.”

In augmented reality, the user’s hand as represented in the virtual environment, its motion captured by cameras, can be varied in size from that of a large man to that of a small woman to help uncover potential accessibility problems.

The VPM system is now in day-to-day use with new designs. A “couple of hundred” designs have gone through the process and Rockwell Collins puts the return on its investment at 800% in terms of the number of hours required to fix manufacturability issues discovered virtually in the 3-D model versus physically in a hardware prototype.

Although the CAD data is reduced in resolution when it is converted to a 3-D model for visualization, “we have yet to run into a [manufacturability] problem [in the model] and there not turn out to be a correspondingly real problem [in the hardware],” says Lorenz.

Expanding the capability is next on the agenda. One direction is to take the now-manual assessment process and automate it by bringing in rules-based analysis software. “We are starting to think about how to take the capability to visually inspect a design and apply appropriate rules to get a level of automation where we find things we don’t catch by manual inspection,” says Fischer.

Another direction is to pull more data into the visualization environment for use during design reviews, “information such as cost at the piece-part level, so we can see the implications of design decisions,” says Lorenz. “We are also doing some work at the conceptual design level. We would like to use VPM two or three times during the design cycle, but we are not there yet.”

The company also is looking at using VPM as a basis for developing 3-D work instructions for use on the factory floor, and for the technical documents used by field service representatives to troubleshoot problems. “Their key interest is getting down to the circuit-card level, while [in manufacturing] we work with boxes,” says Fischer.

Rockwell Collins also would like to expand the VPM beyond mechanical CAD data. “We want to do electrical, et cetera, in the same environment by pulling together various types of models,” says Fischer. “Anything you can do in PowerPoint, this can do better. But we need to beef up the electrical CAD side of the equation.”

Next Generation Jammer

By Graham Warwick | graham.warwick@aviationweek.com
Source: AWIN First
July 08, 2013

Raytheon has been selected to develop the Next Generation Jammer (NGJ) pod to replace the ALQ-99 tactical jamming system now carried by U.S. Navy Boeing EA-18G Growler electronic-attack aircraft.

The company has been awarded a $279.4 million contract for the 22-month technology development phase of the program. NGJ is planned to become operational in 2020, providing increased jamming agility and precision and expanded broadband capability for greater threat coverage.

Raytheon was one of four contractors involved in the 33-month technology maturation phase of the NGJ program. The others were BAE Systems, ITT Exelis and Northrop Grumman, but the Defense Department contract announcement says only three bids were received.

Under the TD phase, Raytheon will “design and build critical technologies that will be the foundational blocks of NGJ,” says Naval Air Systems Command. The complete system will be flight tested on the EA-18G in the follow-on, 54-month engineering and manufacturing development phase.

Raytheon confirms receipt of the award and says it offered “an innovative, next-generation solution that meets current customer requirements and potential future needs.” All the competitors based their designs for the NGJ pod on active, electronically scanned array jammer antennas.

Why the Cloud is Winning

Here are another 51 million reasons why the cloud is winning

Summary: Commodity cloud services are delivering savings that put prices charged by large systems integrators to shame, according to the UK’s tech chief.

July 4, 2013 — 08:36 GMT (01:36 PDT)

Faced with a £52m bill from a large IT vendor for hosting “a major programme” the UK government decided to turn to commodity cloud services.

The result? It picked up a comparable service from a smaller player for £942,000.

“In the world of the cloud the services I get from a major systems integrator and from a minor systems integrator are relatively comparable, given the security and ability to host is often specced out anyway,” UK government CTO Liam Maxwell told The Economist’s CIO Forum in London yesterday.

The UK government plans to use commodity cloud services to help free itself from the stranglehold of a small number of systems integrators that traditionally carried out about 80 percent of government IT work, and charged huge sums of money for doing so.

Departments are being encouraged to buy cloud services from the government-run CloudStore — an online catalogue of thousands of SaaS, PaaS and IaaS and specialist cloud services available to public sector bodies — which are sourced by Whitehall through its G-Cloud procurement framework.

The idea of the CloudStore is to provide a platform where it is as easy for small and medium sized businesses to sell to government as the large vendors. The government has simplified the accreditation process to become a supplier to government and the vendors selling through the store range from multi-national corporates to start-ups.

While more than 60 percent of the spend through the CloudStore has been with SMEs since it launched last year, larger deals through the store are still going to big companies, with IBM picking up a £1.2m deal with the Home Office in May.

Spend on G-Cloud services is growing rapidly, passing £25m in May, but is still tiny compared to an estimated annual public sector IT spend of £16bn. However this could pick up even more sharply as long-term contracts with large systems integrators expire.

“The majority of the large contracts finish by 2014-15, so there’s an enormous amount of change underway at the moment,” said Maxwell.

“We’re not going to replace, we’re going to base our services around user need, and in many cases that means not doing the same thing again.”

The UK’s Office of Fair Trading today called for suppliers and purchasers in the UK public sector to share their experiences of how easy it is for smaller vendors to supply government, and of barriers put in place by larger players to prevent government from switching to competitors.

Earlier this year the government’s director of the G-Cloud programme Denise McDonagh said systems integrators are slashing what they charge Whitehall departments in an effort to stop them from switching to cloud services.

Maxwell has plenty of government IT horror stories of his own, telling the conference it historically cost government £723 to process each payment claim made by farmers to the Rural Payments Agency.

“It would be cheaper to rent a taxi, put the cash in the taxi, drive the taxi to the farm and keep a manual record than it would have been the way the outsource contract worked,” he said.


Replacing the Organ Donor

Lab-grown human cells used to recreate liver functions, hope to replace the organ donor

By Jacob Kastrenakes | July 3, 2013 04:02 pm | @jake_k

Using small pieces of human liver that were grown from stem cells, a team led by researchers at Japan’s Yokohama City University was able to significantly restore liver function in mice through only a simple transplant — and they hope to eventually use the same method to save human lives. The team took tiny, lab-grown “liver buds” and inserted them into mice, where within two days the cells hooked into surrounding blood vessels and began performing natural functions of the liver. Though the team has yet to track the long-term health of the mice following the procedure, Nature reports that the animals remained alive and well despite prior liver issues.

Having only been demonstrated on mice, the method is still being considered a proof of concept. But the hope is that its immediately promising results can soon be applied to regenerative medicine. The short supply of liver donors has made growing replacements a high priority for interested researchers, but the Yokohama team’s work — which was published today in Nature — remains a preliminary step toward that goal: one of the team’s leaders told Nature that testing the process in humans is still years away. Among the biggest hurdles is simply the difficulty of growing enough cells to actually test them in human patients.

Remotely Controlling Robots

Astronaut aboard the ISS successfully controls a robot on earth for the first time

By Nathan Ingraham | July 3, 2013 04:35 pm | @NateIngraham


NASA has completed the first successful test in which an astronaut aboard the International Space Station controlled a robot more than 400 miles away on the surface of the Earth. According to Space.com, the June 17th test marks the first time an astronaut on the station has controlled a robot on Earth, an advancement that will hopefully pave the way for similar control of robots deployed on Mars or the moon. The simulated test consisted of astronaut Chris Cassidy controlling a K10 rover at the Ames Research Center in Moffett Field, CA; Cassidy successfully deployed a polyimide-film antenna while navigating simulated terrain via a real-time video feed.

“It was a great success… and the team was thrilled with how smoothly everything went,” said Jack Burns, director of the NASA Lunar Science Institute’s Lunar University Network for Astrophysics Research. The trial was a test for a potential deployment of radio antennas on the far side of the moon, a mission that would utilize the same sort of technology used in last month’s trial. But more tests are needed before such a deployment — NASA says it will conduct follow-up tests of communications between the rover and the ISS in late July and early August.

Flash Drive

June 17, 2013

Object of Interest: The Flash Drive


When Daniel Ellsberg decided to copy the Pentagon Papers, in 1969, he secretly reproduced them, page by page, with a photocopier. The process of duplication was slow; every complete copy of the material spanned seven thousand pages. When Edward Snowden decided to leak details of surveillance programs conducted by the National Security Agency, he was able to simply slip hundreds of documents into his pocket; the government believes that Snowden secreted them away on a small device no bigger than a pinkie finger: a flash drive.

The flash drive’s compact size, ever-increasing storage capacity, and ability to interface with any computer that has a universal-serial-bus port—which is, essentially, every computer—make it an ideal device for covertly copying data or uploading malicious software onto computer systems. Flash drives are, consequently, an ongoing security concern. The devices are reportedly banned from the N.S.A.’s facilities; a former N.S.A. official told the Los Angeles Times that “special permission” is required to use them. Even then, the official said, “people always look at you funny.” In the magazine, Seymour Hersh reported that an incident involving a USB drive resulted in some N.S.A. unit commanders ordering “all ports on the computers on their bases to be sealed with liquid cement.”

USB flash drives are perhaps the purest form of two distinct pieces of technology: flash memory and the universal serial bus. Flash memory was invented at Toshiba in the nineteen-eighties. According to Toshiba’s timeline, the NAND variant of flash memory, which is the kind now used for storage in myriad devices, like smartphones and flash drives, was invented in 1987. The technology, which stores data in memory cells, remained incredibly expensive for well over a decade, costing hundreds of dollars per megabyte in the early to mid-nineteen-nineties. The universal serial bus was developed in the mid-nineties by a coalition of technology companies to simplify connecting devices to computers through a single, standardized port. By the end of the decade, flash memory had become inexpensive enough to begin to make its way into consumer devices, while USB succeeded in becoming a truly universal computer interface.

The first patent for a “USB-based PC flash disk” was filed in April, 1999, by the Israeli company M-Systems (which no longer exists—it was acquired by SanDisk in 2006). Later that same year, I.B.M. filed an invention disclosure by one of its employees, Shimon Shmueli, who continues to claim that he invented the USB flash drive. Trek 2000 International, a Singaporean company, was the first to actually sell a USB flash drive, which it called the ThumbDrive, in early 2000. (It won the trademark for ThumbDrive, which has come to be a generic term for the devices, only a few years ago.) Later that year, I.B.M. was the first to sell the devices in the U.S. The drive, produced by M-Systems, was called the DiskOnKey. The first model held just eight megabytes. The timing was nonetheless fortuitous: 1.44-megabyte floppy disks had long been unable to cope with expanding file sizes, and even the most popular souped-up replacement, the Zip drive, failed to truly succeed it. Optical media, despite storing large amounts of data, remained relatively inconvenient; recording data was time consuming, re-recording it even more so.

Improved manufacturing technologies have simultaneously increased flash drives’ capacity while decreasing their cost. The most popular flash drive on Amazon stores thirty-two gigabytes and costs just twenty-five dollars, while a flash drive recently announced by Kingston can hold one terabyte of data—enough for thousands of hours of audio, or well over a hundred million pages of documents—and transfer that data at speeds of a hundred and sixty to two hundred and forty megabytes per second. Few things come to mind that store more information in less space—a black hole, for instance.
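Those capacity claims are easy to sanity-check. A rough sketch, where the 128 kbps audio bitrate and roughly 3 KB of plain text per page are my assumptions rather than figures from the article:

```python
# Rough check of what one terabyte holds and how long it takes to transfer.

TB = 10**12  # drive makers use decimal units

# 128 kbps audio -> bytes per hour (~57.6 MB)
audio_bytes_per_hour = 128_000 / 8 * 3600
hours_of_audio = TB / audio_bytes_per_hour   # ~17,000 hours: "thousands of hours"

pages_of_text = TB / 3_000                   # ~333 million: "well over a hundred million"

minutes_to_fill = TB / (200 * 10**6) / 60    # at ~200 MB/s, about 83 minutes
print(round(hours_of_audio), round(pages_of_text / 10**6), round(minutes_to_fill))
```

At those rates the drive's advertised figures hold up comfortably.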

More critically, as convenience drives people to share more and more information across networks, rather than through meatspace—why back up data on a spare hard drive when you can store it in the cloud for cents on the gigabyte, or burn a movie to a disc for a friend when you can share it via Dropbox?—flash drives are a convenient means of transporting large quantities of information off the grid. (Getting that data onto the flash drive in the first place may be another matter, though.) Carrying a flash drive in your pocket on the subway does not produce network traffic or metadata that can later be analyzed.

Flash drives have even been used to create a new form of a dead drop in cities around the country: the drives are embedded into walls or other public spaces, and users simply plug their device into the exposed USB port to download or upload data. Though these dead drops are largely a kind of performance art, the intent is to allow people to anonymously share data without passing it over a network—a proposition that is only growing more rarefied.

It seems certain that there will be more Daniel Ellsbergs and Edward Snowdens, and almost as certain that flash drives will be a tool they use to secretly copy and abscond with the information they need—at least until something that is even more discreet, secure, and convenient arrives.

Nitric Oxide – Build Muscle

The Surprising New Tricks Pros Are Using to Build Muscle

Reading about sports these days, we are constantly bombarded with news of top notch athletes being exposed for using illegal steroids.

Steroid use involves huge costs, legal issues, and above all, potential health problems. With such risks, you wonder why anyone would be tempted to go this route.

Fortunately, steroid use may eventually be a thing of the past.  That’s because medical researchers studying how the human body builds muscle and endurance are developing safe and legal substances which can increase the body’s ability to build muscle, without the health risks associated with steroids.

One of the most interesting fields of research surrounds a naturally occurring chemical compound called nitric oxide. Nitric oxide is a vasodilator, which means it helps move oxygen to the muscles when they need it most. Increased nitric oxide in the blood stream signals the blood vessel walls to relax, which allows more blood to flow to the body’s muscles, thus delivering more oxygen and nutrients throughout the body.

It’s been shown to lead to:

  • Drastic Muscle Gains
  • Increased Blood Flow and Oxygen Delivery
  • Boosted Strength, Endurance, and Power
  • Support for Your Immune System
  • Immediate Results
  • Total Body Transformation

While the body naturally increases nitric oxide production during workouts, it produces only a limited amount, so researchers have focused on ways to raise nitric oxide levels artificially.

One of the most successful products to emerge from this research is called Factor 2. It uses “arginines,” amino acids specifically linked to nitric oxide production, to significantly increase oxygen and nutrient flow to the muscles during workouts. As a result, it can safely spark powerful muscle growth, muscle definition, and strength.

Factor 2 produces noticeable results by maximizing muscle gains as you power through your workouts; within a few weeks, users begin to notice additional muscle definition and strength.

Factor 2 is now the recognized leader in nitric oxide stimulation and in legal, safe muscle and strength enhancement. It was Bodybuilding.com’s Best New Brand of the Year (2011), and pro athletes are taking note.

Athletes like professional football player Vernon Davis have discovered the dramatic benefits of using a nitric oxide supplement. Davis has been an advocate of Factor 2 since first taking it, telling his teammates in San Francisco, “Factor 2 has proven results. I believe in results.”

Zero

Who Invented Zero?

Jessie Szalay, LiveScience Contributor
Date: 12 March 2013 Time: 06:22 PM ET

“The concept of zero, both as a placeholder and as a symbol for nothing, is a relatively recent development.”

Though humans have always understood the concept of nothing or having nothing, the concept of zero is relatively new; it fully developed only in the fifth century A.D. Before then, mathematicians struggled to perform even the simplest arithmetic calculations. Today, zero, both as a symbol (or numeral) and as a concept meaning the absence of any quantity, allows us to perform calculus, solve complicated equations, and build computers.

Early history: Angled wedges

Zero was invented independently by the Babylonians, Mayans and Indians (although some researchers say the Indian number system was influenced by the Babylonians). The Babylonians got their number system from the Sumerians, the first people in the world to develop a counting system. Developed 4,000 to 5,000 years ago, the Sumerian system was positional — the value of a symbol depended on its position relative to other symbols. Robert Kaplan, author of “The Nothing That Is: A Natural History of Zero,” suggests that an ancestor to the placeholder zero may have been a pair of angled wedges used to represent an empty number column. However, Charles Seife, author of “Zero: The Biography of a Dangerous Idea,” disagrees that the wedges represented a placeholder.

The Sumerians’ system passed through the Akkadian Empire to the Babylonians around 300 B.C. There, scholars agree, a symbol appeared that was clearly a placeholder — a way to tell 10 from 100 or to signify that in the number 2,025, there is no number in the hundreds column. Initially, the Babylonians left an empty space in their cuneiform number system, but when that became confusing, they added a symbol — double angled wedges — to represent the empty column. However, they never developed the idea of zero as a number.
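The need for a placeholder can be illustrated with a short sketch (a modern base-10 analogy rather than Babylonian base-60): without a zero to hold the empty column, the digit strings for 2,025 and 225 are indistinguishable.

```python
# Evaluate a sequence of digits in a positional number system.
# Each digit's value depends on its position, so a 0 placeholder
# is needed to keep the other digits in their correct columns.
def positional_value(digits, base=10):
    value = 0
    for d in digits:
        value = value * base + d
    return value

# With a zero placeholder, 2,025 is unambiguous:
print(positional_value([2, 0, 2, 5]))   # 2025

# Drop the placeholder and the same digits read as 225:
print(positional_value([2, 2, 5]))      # 225
```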

Zero in the Americas

Six hundred years later and 12,000 miles from Babylon, the Mayans independently developed zero as a placeholder around A.D. 350 and used it in their elaborate calendar systems. Despite being highly skilled mathematicians, the Mayans never used zero in equations. Kaplan describes the Mayan invention of zero as the “most striking example of the zero being devised wholly from scratch.”

India: Where zero became a number

Some scholars assert that the Babylonian concept wove its way down to India, but others give the Indians credit for developing zero independently.

The concept of zero first appeared in India around A.D. 458, when mathematical equations were spelled out or chanted in poetry rather than written in symbols. Different words symbolized zero, or nothing, such as “void,” “sky” or “space.” In 628, a Hindu astronomer and mathematician named Brahmagupta developed a symbol for zero: a dot underneath numbers. He also developed mathematical operations using zero, wrote rules for reaching zero through addition and subtraction, and described the results of using zero in equations. This was the first time anywhere in the world that zero was recognized as a number in its own right, both as an idea and as a symbol.
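Brahmagupta's rules can be restated in modern notation (a paraphrase; he expressed them rhetorically, not symbolically): adding or subtracting zero leaves a number unchanged, and multiplying by zero yields zero.

```python
# A modern restatement of Brahmagupta's rules for zero,
# checked against a few sample values (including zero itself).
for a in [-3, 0, 7, 42]:
    assert a + 0 == a      # a number plus zero is unchanged
    assert a - 0 == a      # a number minus zero is unchanged
    assert a * 0 == 0      # a number times zero is zero
    assert 0 - a == -a     # zero minus a number negates it

print("Brahmagupta's addition, subtraction, and "
      "multiplication rules hold for these sample values.")
```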

From the Middle East to Wall Street

Over the next few centuries, the concept of zero caught on in China and the Middle East. According to Nils-Bertil Wallin of YaleGlobal, by A.D. 773, zero reached Baghdad where it became part of the Arabic number system, which is based upon the Indian system.

A Persian mathematician, Mohammed ibn-Musa al-Khowarizmi, suggested that a little circle should be used in calculations if no number appeared in the tens place. The Arabs called this circle “sifr,” or “empty.” Zero was crucial to al-Khowarizmi, who used it to invent algebra in the ninth century. Al-Khowarizmi also developed quick methods for multiplying and dividing numbers, which are known as algorithms — a corruption of his name.

Zero found its way to Europe through the Moorish conquest of Spain and was further developed by Italian mathematician Fibonacci, who used it to do equations without an abacus, then the most prevalent tool for doing arithmetic. This development was highly popular among merchants, who used Fibonacci’s equations involving zero to balance their books.

Wallin points out that the Italian government was suspicious of Arabic numbers and outlawed the use of zero. Merchants continued to use it illegally and secretively, and the Arabic word for zero, “sifr,” brought about the word “cipher,” which not only means a numeric character, but also came to mean “code.”

By the 1600s, zero was used fairly widely throughout Europe. It was fundamental in René Descartes’ Cartesian coordinate system and in Sir Isaac Newton’s and Gottfried Wilhelm Leibniz’s developments of calculus. Calculus paved the way for physics, engineering, computers, and much of financial and economic theory.