Thursday, November 11, 2010

Vodafone offers HTC Wildfire and Samsung Galaxy S


The Wildfire offers HTC's Sense user interface and support for Flash, while the Galaxy S features 16GB of onboard memory and the ability to record video in 720p HD. Both the Wildfire and the Galaxy S have a 5MP camera.
The Samsung Galaxy S, which challenges Apple's newly released iPhone 4, offers the ability to use business applications and improved security. The Android-powered Galaxy S also lets its owners download applications from Google's Android Market and Samsung's own application store.

The Wildfire is available free from Vodafone on a £20-per-month, two-year contract, which offers 300 minutes, 500MB of data and unlimited texts.

Vodafone also offers the Galaxy S free on a £30-per-month, two-year contract, which gives subscribers 900 minutes, 1GB of data and unlimited texts.

Holiday Lighting Remote Kit

Tired of climbing behind your Christmas tree to turn it on? I always come out with a hair full of tinsel, which is a nice look, but more New Year's, don't you think?

This year we're using Smart Home's INSTEON Wireless Remote Starter kit which controls all your outside Christmas lights, that tasteful blow-up Santa on the lawn, and the holiday Douglas fir.

Smart Home tells us that the INSTEON Holiday Lighting Starter Kit can be set up "in less than five minutes" (it took me a little longer) to control any connected device remotely using a simple "Plug and Tap" setup process. For example, to control "linked" devices, you can tap one of the RemoteLinc's buttons on or off, or you can hold a button to brighten or dim. Each of the six ON/OFF buttons on the RemoteLinc can be set up to control every light or just one light in your home remotely. (It can be used over 150' from the Access Point.)
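The linking scheme Smart Home describes can be sketched as a toy model. This is only an illustration of the tap/hold behaviour, not Smarthome's actual INSTEON protocol; the class and light names are invented:

```python
# A toy model of the RemoteLinc behaviour described above: each of the six
# buttons is linked to a set of lights; a tap toggles them, and a hold
# steps the brightness. Illustrative only -- not the real INSTEON protocol.

class RemoteLinc:
    def __init__(self, links):
        # links: button number -> list of light names linked to that button
        self.links = links
        self.level = {light: 0 for lights in links.values() for light in lights}

    def tap(self, button):
        # A tap flips each linked light between off and full brightness.
        for light in self.links[button]:
            self.level[light] = 100 if self.level[light] == 0 else 0

    def hold(self, button, step=10):
        # A hold steps the brightness up, capped at 100%.
        for light in self.links[button]:
            self.level[light] = min(100, self.level[light] + step)

remote = RemoteLinc({1: ["tree", "santa"], 2: ["fir"]})
remote.tap(1)
print(remote.level)  # {'tree': 100, 'santa': 100, 'fir': 0}
```

A tap toggles every light linked to a button at once, mirroring the "control every light or just one light" linking described above.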



For only $89.99 you can simplify your holidays -- and leave the tinsel for the blow-out New Year's party. Some lucky POPGADGET email subscriber will win one of these, so make sure you're on Santa's favorites list.

HTC Wildfire and Sony Ericsson’s Xperia X10 Mini


The HTC Wildfire features a Qualcomm 528MHz processor, a 3.2-inch touchscreen, a 3.5mm audio jack, a microSD card slot and HSDPA. It also offers a 5MP camera with autofocus and flash, Wi-Fi, GPS and HTC's elegant Sense user interface. It runs Android 2.1 and integrates social networks such as Facebook and Twitter.

The HTC Wildfire will also be available on the 3, T-Mobile and Virgin Media networks, giving shoppers a good selection of networks to choose from.

The Sony Ericsson Xperia X10 Mini is available for pre-order from Vodafone, free on a £20-per-month, 24-month contract.

DMP Pica200 GPU is the power behind Nintendo 3DS (Video)

We'd never heard of Digital Media Professionals until this very moment, but we'd guess the company won't have that problem in future -- according to a press release fresh off the Japanese wire, its Pica200 GPU is the one pushing pixels to Nintendo's autostereoscopic screen. While we don't know exactly how the tiny graphics unit works or what CPU it might be paired with in a system-on-a-chip, the company claims it supports per-pixel lighting, procedural textures and antialiasing among a host of other effects, and generates 15.3 million polygons per second at its native 200MHz. What is impressive is the video after the break -- reportedly rendered entirely on the chip -- and of course, the 3DS itself, but you'll have to take our word on that.

Sony VAIO Z gets Core i7 processor and more

Sony's rather quietly refreshed its VAIO Z laptops -- to include Intel's Core i7 processor and a new, optional 1920 x 1080 display. The new display upgrade is free until July 3rd so if you've been thinking about grabbing up a VAIO Z, now's probably the time to do it. You can check out our full review of the earlier VAIO Z here. Hit up the source if you just can't wait to start shopping.

Toshiba Satellite T235D-S1350 TruBrite 13.3-Inch Ultrathin Laptop


Toshiba's new Satellite T235D-S1350 TruBrite 13.3-Inch Ultrathin Laptop is designed to be environmentally conscious, with a power-efficient, mercury-free LED-backlit display that also provides brilliantly colorful imagery for photos and video thanks to the native HD TruBrite screen. The T235D-S1350 features a 13.3-inch widescreen HD TruBrite LED-backlit display (1366 x 768 resolution, 16:9 aspect ratio). The T235 has a full-sized keyboard and a full-sized touchpad with multi-touch capabilities, enabling greater flexibility to browse and control what's happening on-screen.

The T235D-S1350 is powered by a 1.2GHz Intel Pentium U5400 ultra-low-voltage processor, weighs less than 4 pounds and measures less than 1 inch thin, offering the performance, flexibility and functionality you expect in a standard-sized laptop in a highly portable and efficient package. With up to nine hours of battery life, the Toshiba T235 series lets you go through an entire day of school, work or errands without worrying about a recharge. The built-in webcam and Toshiba Face Recognition software give you a more convenient way to communicate, log on or share your laptop among the family, and a USB Sleep-and-Charge port and a hard-drive impact sensor that protects your data round out the package.

New MacBook Pro 2010

The Apple Store is down, and this can only mean one thing: the new MacBook Pro 2010 may be released today.
All the rumours are pointing towards only one thing: yes, indeed, Apple is most likely to update its MacBook Pro lineup today.

Last week, Macworld Australia heard from sources that new MacBook Pro models will be announced next Tuesday, that is today, the 13th of April. Their source is also familiar with Apple product cycles and inventory levels.
Stay tuned to find out what Apple will, most likely, reveal today. New MacBook Pro with Core i5 and i7? New iMacs? Or both?

Airwash Waterless Washing Machine

It looks like we may soon be able to rid ourselves of the water-guzzling washing machines in our homes. The Airwash Waterless Washing Machine takes "dry-cleaning" to a new level. Instead of immersing clothes in water, the Airwash cleans them using negative ions, compressed air and deodorants.





What is not clear is how honey dropped on my tie is going to be cleaned. I suppose there will still be times when I will have to spot clean a clothing item to get rid of organic materials.

However, in the "smart" (read that as "green" or environmentally friendly) home, any appliance that can be used to reduce the use of water in a household is to be welcomed. I can see an Airwash Machine installed in my bedroom, but other items will still need to be cleaned in the washing machine. Perhaps the Airwash Machine could cut the amount of "water washing" I do in half; that would still be a great saving.

The Airwash Machine is not yet available, so we will have to wait a little longer before we see it in our bedrooms.

iPhone 3GS Coming to India and Pakistan

After waiting for a long time, India is finally getting the iPhone 3GS via Bharti Airtel.
Amit Agarwal of Digital Inspiration got an email today from Airtel PR saying that they would soon start selling the iPhone 3GS in India. No exact dates were revealed, though.
New Delhi, March 19, 2010: Bharti Airtel and Apple have reached an agreement to bring iPhone 3GS, the fastest most powerful iPhone yet, to India in the coming months. For information please visit www.airtel.in/iphone3gs. For more information on iPhone, please visit www.apple.com/iphone.
I’m sure all my Indian friends who wanted an iPhone 3GS will be glad to hear this. And like I said, keep checking Airtel, as they should announce the release dates soon.

iMovie for iPhone

iMovie for iPhone app has just been announced and is now available in the App Store for $4.99 only!
It works just like the iMovie desktop app, but on a smaller screen! It gives you almost all the features for premium video editing and a lot more!


iMovie for iPhone Features

Here are the features of iMovie announced at the WWDC 10 a few minutes ago.
  • Records geolocation in videos
  • Transition effects
  • Themes
  • Video export to 360p, 540p and 720p
You can download it from the App Store for $4.99 for your iPhone!

Grin and Snap it

Face recognition is all the rage these days in digital cameras, but Sony appeared to have taken the technology to the next level with its "smile shutter." The technology is supposed to detect when someone is smiling in a scene and automatically snap a photo of them. When I heard about the feature, I confess I was skeptical. Nevertheless, it tickled my curiosity. While I haven't been able to get my hands on a Sony model, like the DSC-T200, with the feature, Ed Baig over at USA Today has and wrote about it today.

"Smile shutter worked but not perfectly," he scribbles. "To be sure, I captured a lot of handsome smiles. But the camera missed a number of smiles, too, or took a picture when the subject wasn't fully in the frame."
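The trigger logic behind a feature like smile shutter can be sketched in a few lines. Sony's implementation is proprietary; `detect_smile` here is a hypothetical stand-in for an on-camera detector that returns a confidence score per frame:

```python
# A minimal sketch of "smile shutter" control logic: auto-capture a frame
# when smile confidence crosses a threshold, then re-arm only after the
# smile ends so one long smile doesn't fire a burst of shots.
# detect_smile is a hypothetical stand-in, not Sony's actual detector.

def smile_shutter(frames, detect_smile, threshold=0.8):
    """Return indices of frames the camera would auto-capture."""
    captured = []
    armed = True
    for i, frame in enumerate(frames):
        score = detect_smile(frame)
        if armed and score >= threshold:
            captured.append(i)   # snap the photo
            armed = False        # wait for the smile to end
        elif score < threshold:
            armed = True         # smile over: ready for the next one
    return captured

# Example with fake per-frame smile scores instead of real images:
scores = [0.1, 0.3, 0.9, 0.95, 0.2, 0.85]
print(smile_shutter(scores, detect_smile=lambda s: s))  # [2, 5]
```

The debounce (`armed` flag) is one plausible reason a real camera "missed a number of smiles": the shutter only fires on the rising edge of a detected smile.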

Wireless Spy Camera Pen



Q should really order up this Wireless Spy Camera Pen for Mr. Bond, as it is certainly something 007 would use.

When you twist the cap of the Wireless Spy Camera Pen, it begins transmitting. The pen connects to a 2.5-inch LCD screen that has a range of 20 meters. The video is 882x240 resolution, and the pen has 64MB of built-in storage, with support for SD and MMC memory cards up to 2GB.

Like it? It even comes with a solar charger. Get it now for $330.

Nokia to enter netbook market


Nokia has announced it will start to make laptops, entering a fiercely competitive but fast-growing market with a netbook running Microsoft's Windows operating system.

Nokia had said earlier this year it was considering entering the laptop industry, crossing the border between two converging industries in the opposite direction to Apple, which entered the phone industry in 2007 with the iPhone.

Nokia has seen its profit margins drop over the last quarters as handset demand has slumped, and analysts have worried that entering the PC industry, where margins are traditionally razor-thin, could hurt Nokia's profits further.

"We are fully aware what has the margin level been in the PC world. We have gone into this with our eyes wide open," said Kai Oistamo, the head of Nokia's phone unit.

"There's really an opportunity to bring fresh perspective to the PC world," he said, adding that Nokia would introduce extended battery life and continuous connectivity.

Nokia has produced PCs before, but divested the unit in 1991 when it started to focus on the mobile phone industry.

But Nokia's first netbook, the Nokia Booklet 3G, will use Microsoft's Windows software and Intel's Atom processor to offer up to 12 hours of battery life while weighing 1.25 kilograms. Netbooks are low-cost laptops optimised for surfing the Internet and performing other basic functions. Pioneered by Asustek with the hit Eee PC in 2007, netbooks have since been rolled out by other brands such as HP and Dell.

"The question is: How will Nokia differentiate? This is already a crowded market. If they manage to differentiate it's going to give them competitive advantage," said Gartner analyst Carolina Milanesi.

Cut-throat segment

Research firm IDC expects netbook shipments this year to grow more than 127 percent from 2008 to over 26 million units, outperforming the overall PC market that is expected to remain flat and a phone market which is shrinking some 10 percent.

"Nokia will be hoping that its brand and knowledge of cellular channels will play to its strengths as it addresses this crowded, cut-throat segment," said Ben Wood, director of research at CCS Insight.

"At present we see Nokia's foray into the netbook market as a niche exercise in the context of its broader business."

Nokia's choice of Windows software surprised some analysts who had expected the company to use Linux in its first laptop.

Analyst Neil Mawston from Strategy Analytics said the technology choices were a good win for the US companies.

"We believe ARM and Symbian are among the main losers from the Nokia Booklet announcement," he said.

Shares in ARM were 0.2 percent lower at 1400 GMT, underperforming the slightly firmer DJ Stoxx European technology shares index. Shares in Nokia were 1.6 percent stronger at 8.91 euros, while Microsoft was 0.6 percent firmer.

Nokia said it would unveil detailed specifications, market availability and pricing of its new device on Sept 2.

A source close to Nokia said the new netbook would use the upcoming Windows 7 operating system. Microsoft says a stripped-down version of Windows 7 will be introduced to netbooks the same time as its general release on October 22.

Local media reports in Taiwan have said that Compal, the world's No. 2 contract laptop PC maker, has pitched netbook models to Nokia, but there has been no official confirmation from either side.

Nokia declined to comment on the manufacturer it uses.

Most of the world's top electronics brands typically do their own design work, but outsource the manufacturing process to contract manufacturers such as Compal and its larger rival Quanta.


VW266H LCD Monitor


Personal Entertainment on Desk

Model VW266H
Recommended Retail Price AUD$619.00
Design Personal Entertainment on Desk
Specifications Panel Size: 25.5" Wide Screen
True Resolution: 1920x1200
Response Time: 2ms (Gray-to-Gray)
Contrast Ratio (Max.): 20000:1 (ASCR)
Stereo Speakers: 3W x 2 RMS
SPLENDID Video Intelligence Technology
SPLENDID Video Preset Modes (5 modes)
HDCP support
PC Input: DVI-D/D-Sub
PC Audio Input: 3.5mm Mini-jack
Video Input: Component (YPbPr)/HDMI
AV Audio Input: HDMI
Audio Output: SPDIF
Earphone jack: 3.5mm Mini-jack
Warranty: 3 Years
Extra Full HD 1080p for High-resolution Digital Content Display
Multimedia Enjoyment Supported by Rich Video and PC Inputs
Advanced Video Technologies for Exceptional Visuals
Stylish and User-friendly Design for Modern Sophistication

Full HD 1080p Visual Excellence with Multimedia Enjoyment

The UL50V notebook

The UL50V notebook is designed for all-day computing with up to 12 hours of battery life and is crafted from satin-brushed aluminium to an ultra-light 2.3kg.

Model UL50VT
Recommended Retail Price AUD$1299.00
Design Intel® Centrino® 2 processor technology - ultra-low-voltage Intel® Core™ 2 Duo processor SU7300
Specifications Switchable Nvidia GeForce G210M with 512MB DDR3 VRAM
Platform Windows 7 Home Premium, 64-Bit Edition / Windows 7 Professional 64-bit Edition
Screen 15.6" LED Backlit with DVD-RW
Memory & Storage 4GB RAM, 320GB HDD

The UL50V notebook is designed for all-day computing with up to 12 hours of battery life and is crafted from satin-brushed aluminium to an ultra-light 2.3kg, under 1 inch thick. It features a multi-gesture touchpad, card reader, Bluetooth, wireless and HDMI.

Dell AX4-5


Simple, Scalable and Affordable

The Dell AX4-5 arrays combine low cost, easy to use features with the scalability and functionality of advanced storage arrays. Dual controller models offer the excellent availability and performance that business-critical applications require. The AX4-5 can support up to four expansion enclosures spanning up to 60 hard drives. With the ability to provide consolidated storage for up to 64 hosts, the AX4-5 can provide the headroom that will keep up with your data and application growth. With both 1Gb/s iSCSI and 4Gb/s Fibre Channel models, the AX4-5 enables organisations to choose a network interconnect that is right for their environment and needs.

New Software Capabilities
•RAID 6 support offers an extra layer of data protection for customers who want to further reduce the risk of losing data based on multiple drive faults.
•Extended SAS drive support through elimination of the Expansion pack requirement for SAS drive support for configurations with up to 12 drives.
•Enhanced integration with VMware solutions through added capabilities for an operating system on a VMware Virtual Machine, including: Console-less VMware, Navisphere Host Agent (Windows only), and Navisphere Server Utility (Windows only).

Tiered Storage Options to Match Your Needs

The AX4-5 supports the simple, yet powerful concept of tiered storage by having the option to easily mix SAS drives geared for performance, as required by I/O-intensive applications, with SATA drives that deliver cost-effective capacity for backup and archiving. Also, users can migrate data between different classes of drives and RAID types, dynamically and seamlessly, helping to avoid application disruption.

Storage Capacity
Up to 27TB of raw storage capacity with SAS drives or 120TB with SATA II drives

Scalability
Support for up to 64 highly available servers attached to a single AX4-5 array in either a Fibre Channel or an iSCSI SAN
Directly connect up to 4 servers via Fibre Channel (AX4-5F) or iSCSI (AX4-5I)
Four front-end 4 Gbps Fibre Channel optical (AX4-5F) or Gigabit Ethernet copper (AX4-5I) ports per array. With the use of PowerPath® failover and load balancing software for multipath I/O, users can operate all four ports simultaneously

Cache
Up to 2GB of cache (1GB per storage processor)
UPS-backed mirrored cache for the dual storage processor system
Cache-destaging to disk

RAID Levels
RAID 3, 5, 6 and 1/0

Learn More
Supported Servers
All dual and quad socket Dell PowerEdge™ servers
Variety of Compaq®, HP®, IBM® and SUN® servers as validated by EMC

The Nokia N900


Introducing the Nokia N900
It's more than a mobile phone - it's a mobile computer.

The Nokia N900 is the next step in the evolution of the mobile phone. Packed with all the latest features and running the advanced Maemo operating system, this innovative high-performance device redefines the mobile phone into a true mobile computer.

The Nokia N900's powerful hardware and cutting-edge technology is packed into a sleek, compact design. The sharp 3.5 inch display is touch-sensitive and runs at a PC-grade 800 x 480 resolution. Tap the screen or slide it up to reveal a 3-row QWERTY keypad – you can use the Nokia N900 your way.

The Nokia N900 also comes with a 5.0 megapixel Carl Zeiss camera lens, with a sliding lens cover that activates the camera software, and a bright LED flash. In addition, it has a kick-stand that can act as a tripod or make it easier to watch videos.

Web browsing is also second to none on the Nokia N900. The powerful Maemo Browser has a fast Mozilla engine and gives you full access to rich interactive content, allowing you full-screen web browsing with Adobe Flash video and animation support. It's just like a browser on your home computer!

Multitasking is made easy with the live dashboard and panorama desktop. Create a desktop for your friends, one for your music and videos, and another dedicated to the web. It's really up to you. The live dashboard also lets you keep an eye on all your apps, conversations, missed calls and new messages – allowing you to run everything at once and jump instantly from one task to another.

HP EliteBook 8740w


HP EliteBook 8740w: Rugged, Powerful, and Sleek


The HP EliteBook 8740w is a highly configurable, business-rugged desktop replacement laptop that actually looks pretty cool. You can have it configured with a Core i5 or i7 processor, up to 32GB of 1333MHz DDR3 RAM, over 1TB of hard-drive space (or up to 256GB of solid-state drive space), and HP's "DreamColor" screen. Naturally, such a loaded model will set you back considerably (here's a hint--the DreamColor screen alone will cost about $600), but it's nice to have the options. For those of us who aren't flush with cash, however, the prebuilt EliteBooks still offer a lot of power in an attractive package.

The EliteBook 8740w starts at a somewhat hefty $1999. Our review model, about $3000 (as of October 1, 2010), had a 2.4GHz Intel Core i5 520M processor, 2GB of RAM, a 250GB hard drive (running at 7200 rpm), and Windows 7 Professional (64-bit). It also featured a 2-megapixel built-in Webcam, an nVidia Quadro FX 3800M graphics card with 1GB of dedicated video memory, Wi-Fi 802.11b/g/n, Bluetooth, and a DVD±RW drive.

Acer Aspire 5741G-6983


Acer Aspire 5741G-6983: Closet Gaming machine
The Acer Aspire 5741G-6983 is a tweener in the best sense of the word: It fits right between a great mainstream unit and a powerful gaming laptop. It has good input ergonomics, a nice 15.6-inch, 1366-by-768-pixel display, great everyday performance, and elegant styling. Throw in an AMD Mobility Radeon HD 5470 (with 512MB of dedicated memory) for smooth video and decent gaming frame rates, and you have a laptop that can handle almost anything.

At the heart of the 5741G-6983--one of the beefier 5741 configurations--is an Intel Core i5 430M running at 2.27GHz and the aforementioned AMD GPU. To feed the CPU, Acer provides 4GB of DDR3 memory and a 500GB hard drive. A Blu-ray drive/DVD burner is part of the package as well.

Ports consist of three USB, HDMI, ethernet, VGA, audio in/out, and a five-in-one memory card reader. The ethernet port sits about two-thirds of the way forward on the left side of the unit--not a particularly convenient location. Connectivity is top-notch with both gigabit and wireless 802.11b/g/n on board.

Universal serial bus (USB) flash drive - Great new technology for storage

A USB flash drive consists of a flash memory data storage device integrated with a USB (Universal Serial Bus) 1.1 or 2.0 interface. USB flash drives are typically removable and rewritable, and physically much smaller than a floppy disk. Most weigh less than 30 g (1 oz). Storage capacities in 2010 can be as large as 256 GB, with steady improvements in size and price per capacity expected. Some allow 1 million write or erase cycles and have a 10-year data retention cycle.
USB flash drives are often used for the same purposes as floppy disks were. They are smaller, faster, have thousands of times more capacity, and are more durable and reliable because of their lack of moving parts. Until approximately 2005, most desktop and laptop computers were supplied with floppy disk drives, but most recent equipment has abandoned floppy disk drives in favor of USB ports.
Flash drives use the USB mass storage standard, supported natively by modern operating systems such as Windows, Mac OS X, Linux, and other Unix-like systems. USB drives with USB 2.0 support can store more data and transfer faster than a much larger optical disc drive and can be read by many other systems such as the Xbox 360, PlayStation 3, DVD players and in some upcoming mobile smartphones.
Nothing moves mechanically in a flash drive; the term drive persists because computers read and write flash-drive data using the same system commands as for a mechanical disk drive, with the storage appearing to the computer operating system and user interface as just another drive. Flash drives are very robust mechanically.
A flash drive consists of a small printed circuit board carrying the circuit elements and a USB connector, insulated electrically and protected inside a plastic, metal, or rubberized case which can be carried in a pocket or on a key chain, for example. The USB connector may be protected by a removable cap or by retracting into the body of the drive, although it is not likely to be damaged if unprotected. Most flash drives use a standard type-A USB connection allowing plugging into a port on a personal computer, but drives for other interfaces also exist.
Most USB flash drives draw their power from the USB connection and do not require a battery. They should not be confused with look-alike music player devices that combine the functionality of a digital audio player with flash-drive-type storage and require a battery for the player function.
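Because the USB mass storage class makes a flash drive appear as just another drive, ordinary file I/O is all a program needs to use one. A minimal sketch (the `/media/usb0` mount point is hypothetical; the demo substitutes a temporary directory so it runs anywhere):

```python
# Since the OS mounts a flash drive as an ordinary filesystem, writing to
# it is plain file I/O. A real Linux mount point might look like
# "/media/usb0" (hypothetical); we use a temporary directory for the demo.
import os
import tempfile

def backup_note(mount_point, name, text):
    """Write a small text file onto the (mounted) drive and return its path."""
    path = os.path.join(mount_point, name)
    with open(path, "w") as f:
        f.write(text)
    return path

with tempfile.TemporaryDirectory() as fake_drive:
    saved = backup_note(fake_drive, "note.txt", "hello from the flash drive")
    with open(saved) as f:
        print(f.read())  # hello from the flash drive
```

This is exactly why the term "drive" persists: the storage looks to the operating system like any other disk, with no device-specific API required.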

The foreign exchange market (forex, FX, or currency market) is a worldwide decentralized over-the-counter financial market for the trading of currencies. Financial centers around the world function as anchors of trading between a wide range of different types of buyers and sellers around the clock, with the exception of weekends. The foreign exchange market determines the relative values of different currencies.[1]
The primary purpose of the foreign exchange market is to assist international trade and investment, by allowing businesses to convert one currency to another currency. For example, it permits a US business to import British goods and pay Pound Sterling, even though the business's income is in US dollars. It also supports speculation, and facilitates the carry trade, in which investors borrow low-yielding currencies and lend (invest in) high-yielding currencies, and which (it has been claimed) may lead to loss of competitiveness in some countries.[2]
In a typical foreign exchange transaction a party purchases a quantity of one currency by paying a quantity of another currency. The modern foreign exchange market started forming during the 1970s when countries gradually switched to floating exchange rates from the previous exchange rate regime, which remained fixed as per the Bretton Woods system.
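The basic transaction described above, and the carry trade mentioned earlier, amount to simple arithmetic. A small sketch with invented rates and yields (not market data):

```python
# Illustrative sketch of a currency conversion and a simple carry trade.
# All exchange rates and interest yields below are invented for the example.

def convert(amount, rate):
    """Convert an amount at a given exchange rate (units of quote per base)."""
    return amount * rate

# A US importer paying a GBP 10,000 invoice at 1.50 USD per GBP:
print(convert(10_000, 1.50))  # 15000.0

def carry_profit(principal, borrow_rate, invest_rate):
    """One-period carry trade profit, ignoring exchange-rate moves
    (which is where the real risk of the trade lives)."""
    return principal * (invest_rate - borrow_rate)

# Borrow 1,000,000 of a low-yield currency at 0.5%, invest at 4.5%:
print(round(carry_profit(1_000_000, 0.005, 0.045), 2))  # 40000.0
```

The sketch ignores spreads, fees and, crucially, currency movements: a shift in the exchange rate can wipe out the interest differential, which is the loss-of-competitiveness risk the text alludes to.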
The foreign exchange market is unique because of its
  • huge trading volume, leading to high liquidity
  • geographical dispersion
  • continuous operation: 24 hours a day except weekends, i.e. trading from 20:15 GMT on Sunday until 22:00 GMT Friday
  • the variety of factors that affect exchange rates
  • the low margins of relative profit compared with other markets of fixed income
  • the use of leverage to enhance profit margins with respect to account size
As such, it has been referred to as the market closest to the ideal of perfect competition, notwithstanding market manipulation by central banks. According to the Bank for International Settlements, average daily turnover in global foreign exchange markets was estimated at $3.98 trillion as of April 2010, a growth of approximately 20% over the $3.21 trillion daily volume of April 2007.
The $3.21 trillion break-down is as follows:
  • $1.005 trillion in spot transactions
  • $362 billion in outright forwards
  • $1.714 trillion in foreign exchange swaps
  • $129 billion estimated gaps in reporting
So the foreign exchange market is a great facility that makes trading easy and accessible.

The new 802.11n wireless technology standard introduced



802.11n context: Various wireless LAN technologies are locked in fierce competition today, with WLAN, Bluetooth, HomeRF, UWB and others all vying for adoption, but the IEEE 802.11 family of WLAN is the most widely used. Since the IEEE 802.11 standard was introduced in 1997, 802.11b, 802.11a, 802.11g, 802.11e, 802.11f, 802.11h, 802.11i, 802.11j and other standards have been implemented or are in preparation, yet WLAN still faces four shortcomings: insufficient bandwidth, inconvenient roaming, weak network management and poor system security, plus the lack of a killer application. VoWLAN, the WLAN counterpart of today's VoIP applications, is regarded by the industry as the most promising killer application, but because of these shortcomings it has been difficult to develop further.

In order to achieve high-bandwidth, high-quality WLAN service and bring wireless LANs up to Ethernet performance levels, 802.11n emerged.

A wonderful prospect of 500Mbps: In terms of transfer rate, 802.11n can raise WLAN transmission speeds from the 54Mbps offered by current 802.11a and 802.11g up to 108Mbps, and even as high as 500Mbps. This comes thanks to the combined application of MIMO (Multiple Input Multiple Output) and OFDM (Orthogonal Frequency Division Multiplexing) technology, which not only improves the quality of wireless transmission but also greatly enhances the transfer rate.
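The jump from 54Mbps to 108Mbps can be seen with a back-of-envelope calculation: to a first approximation, each additional independent MIMO spatial stream multiplies the single-stream rate. A toy sketch (real 802.11n rates also depend on channel width, guard interval and modulation/coding scheme, which this deliberately ignores):

```python
# Toy first-approximation of MIMO rate scaling: N independent spatial
# streams carry roughly N times the single-stream data rate.
# Real 802.11n rates also depend on channel width, guard interval and
# modulation/coding scheme, which this sketch ignores.

def mimo_rate_mbps(single_stream_mbps, spatial_streams):
    return single_stream_mbps * spatial_streams

print(mimo_rate_mbps(54, 1))  # 54  (802.11a/g single-stream baseline)
print(mimo_rate_mbps(54, 2))  # 108 (two streams: the doubled rate cited above)
```

Getting from there to 500Mbps requires the wider channels and improved coding that the full 802.11n specification adds on top of multiple streams.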

Prospects: the 802.11n transmission rate would reach ten times the current WLAN rate and can support high-quality voice and video transmission, which means people could use Wi-Fi in their offices to make IP telephone and video telephone calls.

In coverage area, 802.11n uses smart antenna technology: an array of multiple independent antennas can dynamically adjust its beam to ensure that WLAN users receive a stable signal and to reduce interference from other signals. Its coverage can therefore be extended to several square kilometers, greatly enhancing WLAN mobility.

Prospects: notebook computers and PDAs gain a much greater range of movement, allowing the WLAN signal to cover offices, hotels and homes, so users can truly experience the convenience and pleasure of the mobile office and mobile life.

In compatibility, 802.11n adopts software-defined radio technology, a fully programmable hardware platform through which base stations and terminals of different systems can achieve interoperability by running different software. This greatly improves WLAN compatibility: 802.11n can remain compatible with earlier WLAN standards, and it can also integrate WLAN with wireless wide area networks such as 3G.

The standards war between two camps

The pity is that 802.11n finds itself in the embarrassing situation of "standard lagging, products premature." The IEEE has not yet formally approved the 802.11n standard, but many manufacturers are already shipping products based on MIMO-OFDM technology, including Airgo, Bermai, Broadcom, Agere Systems, Atheros, Cisco and Intel. The products include wireless LAN cards and wireless routers, and they are already used in large numbers of PCs and notebook computers.

The 802.11n effort has two leading technical camps, the WWiSE (World Wide Spectrum Efficiency) alliance and the TGn Sync alliance. Both camps hoped their own proposal would take priority as the next-generation wireless LAN standard, but their technical architectures have grown increasingly similar (both, for example, are based on MIMO-OFDM), and on August 2 it was reported that they had decided to compromise and submit a common 802.11n version to the Institute of Electrical and Electronics Engineers (IEEE).

In this fierce competition we do not see China's presence, which is a cause for some regret and a consequence of not holding the core technology. A standards war is ultimately a dispute over interests; without WLAN core technology it will be difficult for Chinese enterprises to realize significant benefits, and that is worth pondering.

Editorial: 802.11n can certainly bring WLAN a real killer application. Imagine an office where we no longer use desk phones but Wi-Fi phones, and where notebook computers can move between offices and meeting rooms without interrupting the network connection. At home, we could enjoy a variety of broadband wireless applications over the WLAN, from IPTV to the videophone; more importantly, all kinds of smart home appliances could be connected through the WLAN to the communication system to achieve more intelligent control.

Like a lighthouse in the fog, 802.11n is getting closer and closer to us.

Genetic engineering


Methods have been developed to purify DNA from organisms, such as phenol-chloroform extraction, and to manipulate it in the laboratory, such as restriction digests and the polymerase chain reaction. Modern biology and biochemistry make intensive use of these techniques in recombinant DNA technology. Recombinant DNA is a man-made DNA sequence that has been assembled from other DNA sequences. Such sequences can be transformed into organisms in the form of plasmids or, in the appropriate format, by using a viral vector. The genetically modified organisms produced can be used to make products such as recombinant proteins, be used in medical research, or be grown in agriculture.
In our view it is a great technology and a great facility for human beings.

Bluetooth

The seemingly endless entanglement of data wires connecting today's electronic devices has become slightly less jumbled with the introduction of Bluetooth technology and the creation of the wireless data link. This article delves into the implementation and architecture of Bluetooth. It also describes a functional overview and the applications of Bluetooth, gives its significant advantages over other data transfer technologies such as IrDA and HomeRF, illustrates how a connection is made between two Bluetooth devices, explains link security through data encryption, and finally narrates how Bluetooth will bring a new level of connectivity and convenience when operating electronic devices. These details establish the growing need for Bluetooth technology. Bluetooth is a method for data communication that uses short-range radio links to replace cables between computers and their connected units. It is a frequency-hopping radio technology utilizing the unlicensed 2.4GHz industrial, scientific and medical (ISM) band. Bluetooth was invented by L. M. Ericsson of Sweden in 1994. The standard is named after Harald Blåtand ("Bluetooth"), king of Denmark.
Bluetooth provides significant advantages over data transfer technologies such as IrDA and HomeRF. IrDA is already popular for PC-to-peripheral links but is severely limited by its short connection distance of 1m and its line-of-sight requirement; due to its RF nature, Bluetooth is not subject to such limitations. Linking one Bluetooth device to another involves a series of inquiry and paging procedures. The inquiry process entails the following steps:
*The inquiring Bluetooth device sends out an inquiry access code packet (inquiry packet) to search for and locate other devices.
*Existing Bluetooth devices within the area occasionally enter an inquiry scan state of their own to listen for any inquiring devices.
*When a device in the inquiry scan state receives an inquiry packet, it responds by sending a frequency hop synchronisation (FHS) packet to the inquiring device.
Once inquiry is completed, the paging process follows:
*The inquiring Bluetooth device now wants to establish a connection with another Bluetooth device.
*To successfully locate and page the target Bluetooth device, the paging device estimates the hop frequency and clock of the target using the FHS packet received during inquiry.
*The paging device pages the target device with the target's device access code (DAC) on the frequency it thinks the target device is receiving, and continues to do so until a connection is made.
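The inquiry and paging steps above can be sketched as a toy exchange between two devices; the class, field names and values are invented for illustration and do not reflect the real baseband packet format:

```python
# Toy model of the inquiry/paging handshake described above.
# Packet contents are simplified placeholders, not real baseband fields.

class BluetoothDevice:
    def __init__(self, name, clock, hop_freq):
        self.name = name
        self.clock = clock          # native clock, shared via the FHS packet
        self.hop_freq = hop_freq    # hop-sequence phase, shared via FHS
        self.connected_to = None

    def inquiry_response(self):
        # A device in the inquiry-scan state answers an inquiry packet
        # with an FHS packet carrying its clock and hop frequency.
        return {"name": self.name, "clock": self.clock, "hop_freq": self.hop_freq}

    def page(self, fhs):
        # The paging device uses the FHS data to estimate the target's
        # hop frequency and clock, then pages it until connected.
        target = fhs["name"]
        self.connected_to = target
        return f"connected to {target}"

master = BluetoothDevice("master", clock=1000, hop_freq=17)
target = BluetoothDevice("headset", clock=2042, hop_freq=63)

fhs = target.inquiry_response()      # inquiry: discover the device
print(master.page(fhs))              # paging: establish the connection
```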
ADVANTAGES AND APPLICATIONS OF BLUETOOTH
Bluetooth can handle data and voice simultaneously. It is an open standard for wireless connectivity, with supporters mostly from the PC and cell phone industries, and its primary market is data and voice transfer between communication devices and PCs. It is capable of supporting one asynchronous data channel and up to three synchronous voice channels, or one channel carrying both voice and data. Bluetooth finds use in PC and peripheral networking, data synchronisation for address books and calendars, and home networking of entertainment devices. This capability, combined with ad hoc device connection and automatic service discovery, makes it a superior solution for device and Internet applications.

Web camera (Webcam)

A webcam is a video capture device connected to a computer, laptop or notebook, typically over a USB port or over a network via Ethernet or Wi-Fi. Beyond basic features such as photo capture and video recording, the Dell Webcam Center has a Motion Detection feature that allows us to record live video of anything that occurs in front of the webcam, and a Remote Monitoring feature that takes a photo or records video at specific time frames and saves it to the computer or to a website. This is very useful to parents who want to watch over students staying in foreign countries or at long distances by viewing through the webcam over the Internet.
*To take advantage of these resolutions, we will need a computer with USB 2.0 connection capabilities.
*At lower resolutions, the NX-6000 performed very well. Images are clear, and it is easy to adjust colour increment and decrement, zoom in and out, and sharpness.
*The most popular use is for videotelephony, permitting a computer to act as a videophone. Used while chatting in instant messengers, it lets us see contacts live; this works in most messengers, including Yahoo, Live Messenger and Skype.
Webcams are known for low manufacturing costs and flexibility, making them the lowest-cost form of videotelephony. The USB video device class (UVC) specification allows webcams to interconnect with computers even without proprietary drivers installed. Microsoft Windows XP, Linux and Mac OS X have UVC drivers built in and do not require extra drivers, although vendor drivers are often installed to support additional features. Webcams typically support JPG, WMV, BMP and AVI file formats.
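Motion detection features like the one described are commonly built on frame differencing: comparing successive frames and flagging large pixel changes. A minimal pure-Python sketch on tiny synthetic grayscale frames (the threshold values are arbitrary assumptions):

```python
# Frame-differencing motion detector. Frames are tiny grayscale grids
# (lists of rows of pixel intensities); thresholds are illustrative.

def motion_detected(prev, curr, threshold=30, min_changed=2):
    """Return True if enough pixels changed by more than `threshold`."""
    changed = sum(
        1
        for row_a, row_b in zip(prev, curr)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > threshold
    )
    return changed >= min_changed

still_a = [[10, 10], [10, 10]]
still_b = [[12, 9], [11, 10]]     # sensor noise only
moving  = [[10, 200], [180, 10]]  # something moved into view

print(motion_detected(still_a, still_b))  # -> False
print(motion_detected(still_a, moving))   # -> True
```

Real products apply the same idea per region of a full camera frame, with smoothing to ignore noise.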

Friday, August 13, 2010

A visual tour of the BlackBerry Torch 9800

Here's a look at the features that just might keep BlackBerry users from jumping ship to an Android device or Apple's iPhone


Behold the BlackBerry Torch


1979 Apple Graphics Tablet vs. 2010 Apple iPad

When Apple launched the iPad earlier this year, it was the culmination of fans' long wait for the company to enter the tablet computer market. There's no doubt that Apple's iPad is a revolutionary computing device that's ushering in a new era of tablet computing. But in 1979, an earlier generation of Apple users used a different kind of Apple tablet, back when the word meant something else entirely.
The Apple Graphics Tablet was designed by Summagraphics and sold by Apple Computer Inc. for the Apple II personal microcomputer. (Summagraphics also marketed the device for other platforms as the BitPad.) To be clear, this tablet was not a stand-alone computing device like the iPad. Instead, it was an input device for creating images on the Apple II's screen, and it predated the Apple II's mouse by six years.
Apple II fan Tony Diaz had an Apple Graphics Tablet on hand at last month's KansasFest, an annual convention for diehard Apple II users. He and Computerworld's Ken Gagne, the event's marketing director, compared and contrasted Apple's original tablet with the iPad, snapping photos as they went.
Despite the three decades of technology advancements that separate the two devices, some fun comparisons are still possible. Join us for a photo face-off between the two tablets.

Meet the tablets


Apple II Graphics Tablet and iPad side by side

The Apple Graphics Tablet (left) was released in 1979 and cost $650. It connects to any Apple II and can be used to draw images at a resolution of 280 by 192 pixels. The tablet draws power directly from the Apple II and cannot be used when disconnected.
The Apple II was originally designed to be used with televisions rather than computer monitors, but the Apple Graphics Tablet produced interference that could disrupt reception of television signals. A later model was identical to its predecessor except for one notable new feature: FCC compliance.
The Apple iPad (right) was released in 2010 in six models ranging from $499 to $829. Equipped with a 1-GHz A4 system-on-a-chip and a 16GB, 32GB or 64GB flash drive, it syncs with any Macintosh or Windows machine capable of running iTunes and can run thousands of iOS applications. Its resolution is 1024 by 768 pixels on a 9.7-in. LED-backlit glossy widescreen display.

Thursday, August 12, 2010

Advantages of Online CRM Solution

Customer relationship management, or CRM, is defined as the process of tracking and organizing contacts with your current and prospective customers. An effective CRM practice spans the different departments of your business process and enhances their productivity and service to match the expectations of your customers. According to a recent survey conducted by "Benchmark," CRM applications can increase revenue per salesperson by 41% and improve lead conversion by over 300%. Other advantages of CRM are customer retention, better profit margins, and decreased marketing and sales costs. That is the reason Claudio Marcus, research director at Gartner, commented: "CRM is not part of a business strategy; CRM is the business strategy."

The advent of the internet has caused a paradigm shift in the age-old dynamics of customer relationship management. Web-based, or online, CRM comprises a set of software applications hosted by an Application Service Provider (ASP). These applications enable you to deliver services through the internet. Options range from the free Google Calendar to customized, business-specific online CRM solutions. Installing them in your business process offers manifold advantages: they lower the cost of entry and ownership, they let you implement changes in response to customer demands faster, and they speed the transmission of information along the organizational hierarchy and to your customers.

Accessibility is another advantage of online CRM applications: through them you can reach your customers from any part of the globe. Implementation of online CRM solutions is easy, as they do not need any costly hardware server infrastructure or deployment of backend operations. If you go for a customized online CRM solution, it can adapt to the growing demands of your business.

In a nutshell, online CRM applications will lead you towards smoother operation, an expanded customer base and better profit.

Desktop Computer Released

Onkyo DE411 Desktop Computer Released

Onkyo has just released its new all-in-one desktop PC, the Onkyo DE411. The PC costs around $950 and has been released in Japan. The DE411 features 2GB of RAM, a 320GB hard drive, a digital TV tuner, a DVD burner and WiFi, and, most importantly, it runs Windows 7 Home Premium. The DE411 is a 21.5-inch nettop with built-in speakers. It is an ideal all-in-one desktop PC, and whether you are a forex trader or a web developer, the Onkyo DE411 will serve you well.





HP Pavilion Elite Desktop Computer

The new Pavilion Elite desktop computers coming out are a showcase of multimedia excellence. These powerhouse machines sport a new and improved 1TB hard drive; yes, that is a 1-terabyte drive, far more than enough to handle anything you might throw at it. On top of that, they come standard with 8GB of memory and a video card with 1GB of onboard memory. It may not be the gaming powerhouse some people are looking for, but you should be able to play all the newest games with ease, and at a more reasonable price than some high-end gaming computers. Probably the best aspect of this new computer is the 25-inch high-definition monitor that ships with it: with a resolution of 1920 by 1080, you won't have any issues with quality, and you can scan through HD video content with no lag from the video capture and an always crystal-clear picture.

Wednesday, August 11, 2010

operating system

If Mainframe Is the Answer, What Is the Question?

IBM is beginning a new initiative to raise awareness (and sales) of the System z10 mainframe. Mainframes provide high availability and security but still remain a more specialized platform that won't be right for all businesses. Understand the advant...
Brief

Linux on the Desktop: Not Just for Europe Anymore

Linux desktop adoption has been heralded, prophesied, predicted, and the subject of endless debate. Debate no longer, Info-Tech sees a gradual growth in paid Linux desktop adoption in North America and expects to see more. ...
Brief

Utility Infrastructure Reduces Business Unit Costs & IT Headaches

Utility infrastructure saves computing and storage costs while providing tight cost control for business units, but represents a major shift in IT strategy. Understand the benefits and the effort required to properly plan for a utility infrastructure...
Brief

Technology Migrations: What is the Driving Force?

The release of Windows Vista has opened a recurring question for enterprise IT staff; when do you migrate to new technology? When is the time right for change and what are some of the factors that impact the decision to migrate? Often it is younger e...
Brief

Federal Agencies: New Security Directive Dictates Desktop Configurations

Mandating the usage of standard Windows desktop configurations across all US federal agencies is a step in the right direction for government IT security. However, the timelines are tight. Agencies must make immediate plans to test and deploy these s...

MONITORS AND LCD

A liquid crystal display (LCD) is a thin, flat electronic visual display that uses the light-modulating properties of liquid crystals (LCs). LCs do not emit light directly; liquid crystals were first discovered in 1888.

They are used in a wide range of applications including computer monitors, televisions, instrument panels, aircraft cockpit displays, and signage. They are common in consumer devices such as video players, gaming devices, clocks, watches, calculators, and telephones. LCDs have displaced cathode ray tube (CRT) displays in most applications. They are usually more compact, lightweight, portable, and less expensive, and they are available in a wider range of screen sizes than CRT and other flat panel displays.


LCDs are more energy efficient and offer safer disposal than CRTs. Its low electrical power consumption enables it to be used in battery-powered electronic equipment. It is an electronically-modulated optical device made up of any number of pixels filled with liquid crystals and arrayed in front of a light source (backlight) or reflector to produce images in colour or monochrome. The earliest discovery leading to the development of LCD technology, the discovery of liquid crystals, dates from 1888.[1] By 2008, worldwide sales of televisions with LCD screens had surpassed the sale of CRT units.

NEW COMPUTERS



AMD Athlon II 64X2 245, 2.9GHz
Specs:
- AMD Athlon II 64X2 245, 2.9GHz
- 2GB DDR2 RAM
- 500GB Hard Drive
- 7 USB Ports
- 420 Watt Power Supply
- Nvidia GeForce Video
- DVD-RW
1 Year hardware warranty
Price: $329

Intel Core i3 530, 2.9GHz
Specs:
- Intel Core i3 530, 2.9GHz
- 4GB DDR3 RAM
- 1TB Hard Drive
- Card Reader
- 22X DVD-RW
- 480 Watt Power Supply
Monitor and accessories not included
Price: $599

AMD Phenom II 64X4 Quad Core 620, 2.6GHz
Specs:
- AMD Phenom II X4 620, 2.6GHz
- 4GB DDR2 RAM
- 500GB HDD
- 8 USB Ports
- 480 Watt Power Supply
- 22x DVD-RW
- 6 Channel Audio
- Multi Card Reader
- Nvidia GeForce Video
1 Year hardware warranty
Price: $499

Intel Pentium Dual-Core E5300, 2.6GHz
Specs:
- Intel Pentium Dual-Core E5300, 2.6GHz
- 3GB DDR2 RAM
- 500GB Hard Drive
- Card Reader
- 22X DVD-RW
- 480 Watt Power Supply
Monitor not included
Price: $399

AMD Athlon II 64X2 250, 3.0GHz (64-bit)
Specs:
- AMD Athlon II 64X2 250, 3.0GHz
- 3GB DDR2 RAM
- 500GB HDD
- 6 USB Ports
- 22x DVD-RW
- 480 Watt Power Supply
- Card Reader
- Ethernet
- Nvidia GeForce Video
1 Year hardware warranty
Price: $399

Intel Core2 Duo E7500, 2.93GHz (64-bit)
Specs:
- Intel Core2 Duo E7500, 3MB cache, 2.93GHz
- 4GB DDR2 RAM
- 500GB Hard Drive
- Card Reader
- 7 USB 2.0 Ports
- 480 Watt Power Supply
- 22x DVD-RW
- Ethernet
1 Year hardware warranty
Monitor and accessories not included
Price: $499

History of computing hardware

The Jacquard loom, on display at the Museum of Science and Industry in Manchester, England, was one of the first programmable devices.

The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century. From the end of the 19th century onwards though, the word began to take on its more familiar meaning, describing a machine that carries out computations.[3]

The history of the modern computer begins with two separate technologies—automated calculation and programmability—but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. Examples of early mechanical calculating devices include the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150–100 BC). Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions and when.[4] This is the essence of programmability.

The "castle clock", an astronomical clock invented by Al-Jazari in 1206, is considered to be the earliest programmable analog computer.[5] It displayed the zodiac, the solar and lunar orbits, a crescent moon-shaped pointer travelling across a gateway causing automatic doors to open every hour,[6][7] and five robotic musicians who played music when struck by levers operated by a camshaft attached to a water wheel. The length of day and night could be re-programmed to compensate for the changing lengths of day and night throughout the year.[5]

The Renaissance saw a re-invigoration of European mathematics and engineering. Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers, but none fit the modern definition of a computer, because they could not be programmed.

In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.

It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine.[8] Limited finances and Babbage's inability to resist tinkering with the design meant that the device was never completed.

In the late 1880s, Herman Hollerith invented the recording of data on a machine readable medium. Prior uses of machine readable media, above, had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ..."[9] To process these punched cards he invented the tabulator, and the keypunch machines. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

Alan Turing is widely regarded to be the father of modern computer science. In 1936 Turing provided an influential formalisation of the concept of the algorithm and computation with the Turing machine. Of his role in the modern computer, Time magazine in naming Turing one of the 100 most influential people of the 20th century, states: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine".[10]

The inventor of the program-controlled computer was Konrad Zuse, who built the first working computer in 1941 and later in 1955 the first computer based on magnetic storage.[11]

George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication including complex arithmetic and programmability.[12]
Defining characteristics of some early digital computers of the 1940s (in the history of computing hardware). Each entry below gives: Name / First operational / Numeral system / Computing mechanism / Programming / Turing complete.
Zuse Z3 (Germany) May 1941 Binary Electro-mechanical Program-controlled by punched film stock (but no conditional branch) Yes (1998)
Atanasoff–Berry Computer (US) 1942 Binary Electronic Not programmable—single purpose No
Colossus Mark 1 (UK) February 1944 Binary Electronic Program-controlled by patch cables and switches No
Harvard Mark I – IBM ASCC (US) May 1944 Decimal Electro-mechanical Program-controlled by 24-channel punched paper tape (but no conditional branch) No
Colossus Mark 2 (UK) June 1944 Binary Electronic Program-controlled by patch cables and switches No
ENIAC (US) July 1946 Decimal Electronic Program-controlled by patch cables and switches Yes
Manchester Small-Scale Experimental Machine (Baby) (UK) June 1948 Binary Electronic Stored-program in Williams cathode ray tube memory Yes
Modified ENIAC (US) September 1948 Decimal Electronic Program-controlled by patch cables and switches plus a primitive read-only stored programming mechanism using the Function Tables as program ROM Yes
EDSAC (UK) May 1949 Binary Electronic Stored-program in mercury delay line memory Yes
Manchester Mark 1 (UK) October 1949 Binary Electronic Stored-program in Williams cathode ray tube memory and magnetic drum memory Yes
CSIRAC (Australia) November 1949 Binary Electronic Stored-program in mercury delay line memory Yes

A succession of steadily more powerful and flexible computing devices were constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult (Shannon 1940). Notable achievements include:

* Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating point arithmetic and a measure of programmability. In 1998 the Z3 was proved to be Turing complete, therefore being the world's first operational computer.[13]
* The non-programmable Atanasoff–Berry Computer (1941) which used vacuum tube based computation, binary numbers, and regenerative capacitor memory. The use of regenerative memory allowed it to be much more compact than its peers (being approximately the size of a large desk or workbench), since intermediate results could be stored and then fed back into the same set of computation elements.
* The secret British Colossus computers (1943),[14] which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. It was used for breaking German wartime codes.
* The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.
* The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.

Several developers of ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which came to be known as the "stored program architecture" or von Neumann architecture. This design was first formally described by John von Neumann in the paper First Draft of a Report on the EDVAC, distributed in 1945. A number of projects to develop computers based on the stored-program architecture commenced around this time, the first of these being completed in Great Britain. The first to be demonstrated working was the Manchester Small-Scale Experimental Machine (SSEM or "Baby"), while the EDSAC, completed a year after SSEM, was the first practical implementation of the stored program design. Shortly thereafter, the machine originally described by von Neumann's paper—EDVAC—was completed but did not see full-time use for an additional two years.

Nearly all modern computers implement some form of the stored-program architecture, making it the single trait by which the word "computer" is now defined. While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the von Neumann architecture.

Beginning in the 1950s, Soviet scientists Sergei Sobolev and Nikolay Brusentsov conducted research on ternary computers, devices that operated on a base three numbering system of -1, 0, and 1 rather than the conventional binary numbering system upon which most computers are based. They designed the Setun, a functional ternary computer, at Moscow State University. The device was put into limited production in the Soviet Union, but supplanted by the more common binary architecture.

Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953.[15] In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased size and cost and further increased speed and reliability of computers. By the late 1970s, many products such as video recorders contained dedicated computers called microcontrollers, and they started to appear as a replacement to mechanical controls in domestic appliances such as washing machines. The 1980s witnessed home computers and the now ubiquitous personal computer. With the evolution of the Internet, personal computers are becoming as common as the television and the telephone in the household.

Modern smartphones are fully programmable computers in their own right, and as of 2009 may well be the most common form of such computers in existence.
Stored program architecture
Main articles: Computer program and Computer programming

The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that a list of instructions (the program) can be given to the computer and it will store them and carry them out at some time in the future.

In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction.
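These ideas can be made concrete with a small simulation. The sketch below is not any real machine's instruction set; the opcodes, register names, and program are invented for illustration. It shows a program stored as plain data, a program counter stepping through it, and a conditional jump steering the flow of control:

```python
# A toy stored-program machine: the program is just a list of instructions
# held in memory, and a conditional jump redirects the flow of control.
def run(program):
    regs = {"counter": 0}   # the machine's only storage location
    pc = 0                  # program counter: index of the next instruction
    while pc < len(program):
        op, *args = program[pc]
        if op == "set":                 # set <reg> to <value>
            regs[args[0]] = args[1]
            pc += 1
        elif op == "add":               # add <value> to <reg>
            regs[args[0]] += args[1]
            pc += 1
        elif op == "jump_if_less":      # branch: jump to <target> while reg < limit
            reg, limit, target = args
            pc = target if regs[reg] < limit else pc + 1
        else:                           # halt on any unknown opcode
            break
    return regs

# Count from 0 up to 5 by repeatedly jumping back to instruction 1.
result = run([
    ("set", "counter", 0),
    ("add", "counter", 1),
    ("jump_if_less", "counter", 5, 1),
])
print(result["counter"])  # 5
```

When the test at instruction 2 fails, execution simply falls through to the end of the program, which is exactly how conditional branches terminate loops on real hardware.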

Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention.

Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time—with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. For example:

mov #0, sum ; set sum to 0
mov #1, num ; set num to 1
loop: add num, sum ; add num to sum
add #1, num ; add 1 to num
cmp num, #1000 ; compare num to 1000
ble loop ; if num <= 1000, go back to 'loop'
halt ; end of program, stop running

Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake, and a modern PC can complete the task in about a millionth of a second.[16]

However, computers cannot "think" for themselves, in the sense that they only solve problems in exactly the way they are programmed to. An intelligent human faced with the above addition task might soon realize that, instead of actually adding up all the numbers, one can simply use the equation 1 + 2 + 3 + ... + n = n(n + 1)/2 and arrive at the correct answer (500,500) with little work.[17] In other words, a computer programmed to add up the numbers one by one, as in the example above, would do exactly that without regard to efficiency or alternative solutions.

Programs

A 1970s punched card containing one line from a FORTRAN program. The card reads "Z(1) = Y + W(1)" and is labelled "PROJ039" for identification purposes.

In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers, for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors.

Errors in computer programs are called "bugs". Bugs may be benign and not affect the usefulness of the program, or have only subtle effects. But in some cases they may cause the program to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, or to fail completely, or "crash".
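Returning to the earlier addition example: both the brute-force loop and the human shortcut can be sketched in a high-level language. This is an illustrative translation, not part of the original assembly example:

```python
# Brute force: add every number from 1 to 1000, mirroring the assembly loop.
total = 0
num = 1
while num <= 1000:
    total += num
    num += 1

# The shortcut a human might notice: 1 + 2 + ... + n = n * (n + 1) / 2.
n = 1000
shortcut = n * (n + 1) // 2

print(total, shortcut)  # 500500 500500
```

The loop performs a thousand additions; the formula performs one multiplication and one division, yet both give 500,500.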
Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an "exploit": code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[18]

In most computers, individual instructions are stored as machine code, with each instruction given a unique number (its operation code, or opcode for short). The command to add two numbers together would have one opcode, the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture.

In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture, after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches.

While it is possible to write computer programs as long lists of numbers (machine language), and while this technique was used with many early computers,[19] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs.
Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember: a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) tend to be unique to a particular type of computer. For instance, an ARM architecture computer (such as may be found in a PDA or a hand-held videogame) cannot understand the machine language of the Intel Pentium or AMD Athlon 64 computer that might be in a PC.[20]

Though considerably easier than machine language, writing long programs in assembly language is often difficult and error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High-level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[21] High-level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high-level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures, such as personal computers and various video game consoles.

The task of developing large software systems presents a significant intellectual challenge.
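The mnemonic-to-opcode translation that an assembler performs can be sketched as a simple table lookup. The opcode numbers below are invented for illustration; real instruction encodings are far more involved:

```python
# A toy assembler: translate mnemonics into numeric opcodes so the
# program becomes a flat list of numbers, i.e. "machine code".
OPCODES = {"ADD": 0x01, "SUB": 0x02, "MULT": 0x03, "JUMP": 0x04}

def assemble(lines):
    machine_code = []
    for line in lines:
        mnemonic, operand = line.split()
        # Each instruction is emitted as an opcode followed by its operand.
        machine_code += [OPCODES[mnemonic], int(operand)]
    return machine_code

print(assemble(["ADD 7", "MULT 3", "JUMP 0"]))  # [1, 7, 3, 3, 4, 0]
```

The output is just a list of numbers, which is the point: once assembled, the program can be stored in memory alongside ordinary data.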
Producing software with acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge.

Example

A traffic light showing red.

Suppose a computer is being employed to operate a traffic light at an intersection between two streets. The computer has the following five basic instructions:

1. ON(Streetname, Color): turns the light on Streetname with the specified Color on.
2. OFF(Streetname, Color): turns the light on Streetname with the specified Color off.
3. WAIT(Seconds): waits a specified number of seconds.
4. START: starts the program.
5. REPEAT: tells the computer to repeat a specified part of the program in a loop.

Comments are marked with a // on the left margin. Comments in a computer program do not affect the operation of the program; they are not evaluated by the computer. Assume the street names are Broadway and Main.
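A controller along these lines might be sketched as follows. The timings and light sequence are invented for illustration, and WAIT is stubbed out to record events instead of sleeping, so the sketch runs instantly:

```python
# Sketch of the traffic-light controller. ON, OFF and WAIT mirror the
# instructions described in the text; each call appends to an event log
# rather than driving real hardware.
log = []

def ON(street, color): log.append(("ON", street, color))
def OFF(street, color): log.append(("OFF", street, color))
def WAIT(seconds): log.append(("WAIT", seconds))

def cycle():
    # Let Broadway traffic flow while Main is held at red.
    ON("Broadway", "GREEN")
    ON("Main", "RED")
    WAIT(60)
    OFF("Broadway", "GREEN")
    ON("Broadway", "YELLOW")
    WAIT(5)
    OFF("Broadway", "YELLOW")
    ON("Broadway", "RED")
    # Now swap: Main flows while Broadway waits.
    OFF("Main", "RED")
    ON("Main", "GREEN")
    WAIT(60)
    OFF("Main", "GREEN")
    ON("Main", "YELLOW")
    WAIT(5)
    OFF("Main", "YELLOW")
    ON("Main", "RED")

# REPEAT would run cycle() forever; two iterations suffice for the sketch.
for _ in range(2):
    cycle()
print(len(log))  # 32
```

In a real controller the loop would run indefinitely and WAIT would actually pause, but the structure, a fixed sequence of ON/OFF/WAIT steps repeated forever, is the whole program.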



The Jacquard loom, on display at the Museum of Science and Industry in Manchester, England, was one of the first programmable devices.

The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century. From the end of the 19th century onwards though, the word began to take on its more familiar meaning, describing a machine that carries out computations.[3]

The history of the modern computer begins with two separate technologies—automated calculation and programmability—but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. Examples of early mechanical calculating devices include the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150–100 BC). Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions and when.[4] This is the essence of programmability.

The "castle clock", an astronomical clock invented by Al-Jazari in 1206, is considered to be the earliest programmable analog computer.[5] It displayed the zodiac, the solar and lunar orbits, a crescent moon-shaped pointer travelling across a gateway causing automatic doors to open every hour,[6][7] and five robotic musicians who played music when struck by levers operated by a camshaft attached to a water wheel. The length of day and night could be re-programmed to compensate for the changing lengths of day and night throughout the year.[5]

The Renaissance saw a re-invigoration of European mathematics and engineering. Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers, but none fit the modern definition of a computer, because they could not be programmed.

In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.

It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine.[8] Limited finances and Babbage's inability to resist tinkering with the design meant that the device was never completed.

In the late 1880s, Herman Hollerith invented the recording of data on a machine readable medium. Prior uses of machine readable media, above, had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ..."[9] To process these punched cards he invented the tabulator, and the keypunch machines. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

Alan Turing is widely regarded to be the father of modern computer science. In 1936 Turing provided an influential formalisation of the concept of the algorithm and computation with the Turing machine. Of his role in the modern computer, Time magazine in naming Turing one of the 100 most influential people of the 20th century, states: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine".[10]

The inventor of the program-controlled computer was Konrad Zuse, who built the first working computer in 1941 and later in 1955 the first computer based on magnetic storage.[11]

George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication including complex arithmetic and programmability.[12]
Defining characteristics of some early digital computers of the 1940s (in the history of computing hardware):

Name | First operational | Numeral system | Computing mechanism | Programming | Turing complete
Zuse Z3 (Germany) | May 1941 | Binary | Electro-mechanical | Program-controlled by punched film stock (but no conditional branch) | Yes (1998)
Atanasoff–Berry Computer (US) | 1942 | Binary | Electronic | Not programmable; single purpose | No
Colossus Mark 1 (UK) | February 1944 | Binary | Electronic | Program-controlled by patch cables and switches | No
Harvard Mark I – IBM ASCC (US) | May 1944 | Decimal | Electro-mechanical | Program-controlled by 24-channel punched paper tape (but no conditional branch) | No
Colossus Mark 2 (UK) | June 1944 | Binary | Electronic | Program-controlled by patch cables and switches | No
ENIAC (US) | July 1946 | Decimal | Electronic | Program-controlled by patch cables and switches | Yes
Manchester Small-Scale Experimental Machine (Baby) (UK) | June 1948 | Binary | Electronic | Stored-program in Williams cathode ray tube memory | Yes
Modified ENIAC (US) | September 1948 | Decimal | Electronic | Program-controlled by patch cables and switches, plus a primitive read-only stored-programming mechanism using the Function Tables as program ROM | Yes
EDSAC (UK) | May 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes
Manchester Mark 1 (UK) | October 1949 | Binary | Electronic | Stored-program in Williams cathode ray tube memory and magnetic drum memory | Yes
CSIRAC (Australia) | November 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes

A succession of steadily more powerful and flexible computing devices was constructed in the 1930s and 1940s, gradually adding the key features seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult (Shannon 1940). Notable achievements include:
EDSAC was one of the first computers to implement the stored program (von Neumann) architecture.
Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging.

* Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating point arithmetic, and a measure of programmability. In 1998 the Z3 was proved to be Turing complete, retroactively qualifying it as the world's first operational computer in the modern sense.[13]
* The non-programmable Atanasoff–Berry Computer (1941) which used vacuum tube based computation, binary numbers, and regenerative capacitor memory. The use of regenerative memory allowed it to be much more compact than its peers (being approximately the size of a large desk or workbench), since intermediate results could be stored and then fed back into the same set of computation elements.
* The secret British Colossus computers (1943),[14] which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. It was used for breaking German wartime codes.
* The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.
* The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.

Several developers of ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which came to be known as the "stored program architecture" or von Neumann architecture. This design was first formally described by John von Neumann in the paper First Draft of a Report on the EDVAC, distributed in 1945. A number of projects to develop computers based on the stored-program architecture commenced around this time, the first of these being completed in Great Britain. The first to be demonstrated working was the Manchester Small-Scale Experimental Machine (SSEM or "Baby"), while the EDSAC, completed a year after SSEM, was the first practical implementation of the stored program design. Shortly thereafter, the machine originally described by von Neumann's paper—EDVAC—was completed but did not see full-time use for an additional two years.

