Saturday, March 12, 2011

UDP


User Datagram Protocol (UDP) 




The User Datagram Protocol (UDP) is one of the core members of the Internet Protocol Suite, the set of network protocols used for the Internet. With UDP, computer applications can send messages, in this case referred to as datagrams, to other hosts on an Internet Protocol (IP) network without requiring prior communications to set up special transmission channels or data paths. The protocol was designed by David P. Reed in 1980.

This User Datagram Protocol (UDP) is defined to make available a datagram mode of packet-switched computer communication in the environment of an interconnected set of computer networks. This protocol assumes that the Internet Protocol (IP) [1] is used as the underlying protocol.



  UDP uses a simple transmission model without implicit hand-shaking dialogues for providing reliability, ordering, or data integrity. Thus, UDP provides an unreliable service and datagrams may arrive out of order, appear duplicated, or go missing without notice. UDP assumes that error checking and correction is either not necessary or performed in the application, avoiding the overhead of such processing at the network interface level. Time-sensitive applications often use UDP because dropping packets is preferable to waiting for delayed packets, which may not be an option in a real-time system.[2] If error correction facilities are needed at the network interface level, an application may use the Transmission Control Protocol (TCP) or Stream Control Transmission Protocol (SCTP) which are designed for this purpose.
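The fire-and-forget model described above can be seen with Python's standard datagram sockets. A minimal loopback sketch (the payload and addresses are arbitrary examples, not part of any protocol):

```python
import socket

# UDP needs no handshake: create sockets, bind a receiver, and send.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
receiver.settimeout(2)               # datagrams may be lost; don't wait forever
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)        # no connection setup, no acknowledgement

data, peer = receiver.recvfrom(1024)
print(data)                          # b'hello' (loss is unlikely on loopback)

sender.close()
receiver.close()
```

Note there is no retransmission here: if the datagram were dropped, `recvfrom` would simply time out, which is exactly the unreliability the paragraph above describes.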






Format


                  0      7 8     15 16    23 24    31
                 +--------+--------+--------+--------+
                 |     Source      |   Destination   |
                 |      Port       |      Port       |
                 +--------+--------+--------+--------+
                 |                 |                 |
                 |     Length      |    Checksum     |
                 +--------+--------+--------+--------+
                 |
                 |          data octets ...
                 +---------------- ...

                      User Datagram Header Format









Source Port is an optional field; when meaningful, it indicates the port of the sending process, and may be assumed to be the port to which a reply should be addressed in the absence of any other information. If not used, a value of zero is inserted.

Destination Port has a meaning within the context of a particular internet destination address.

Length is the length in octets of this user datagram, including this header and the data. (This means the minimum value of the length is eight.)
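The header layout above, four 16-bit big-endian fields, can be unpacked in a few lines; a sketch in Python (the example port numbers are made up):

```python
import struct

# UDP header: source port, destination port, length (header + data,
# so at least 8 octets), checksum -- four 16-bit big-endian fields.
def parse_udp_header(datagram: bytes) -> dict:
    if len(datagram) < 8:
        raise ValueError("UDP datagram must be at least 8 octets")
    src_port, dst_port, length, checksum = struct.unpack("!HHHH", datagram[:8])
    return {"src_port": src_port, "dst_port": dst_port,
            "length": length, "checksum": checksum, "data": datagram[8:]}

# Example: source port 1234, destination port 53, five data octets,
# so the length field is 8 + 5 = 13.
pkt = struct.pack("!HHHH", 1234, 53, 13, 0) + b"hello"
hdr = parse_udp_header(pkt)
print(hdr["dst_port"], hdr["length"])    # 53 13
```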

Checksum is the 16-bit one's complement of the one's complement sum of a pseudo header of information from the IP header, the UDP header, and the data, padded with zero octets at the end (if necessary) to make a multiple of two octets.

The pseudo header conceptually prefixed to the UDP header contains the source address, the destination address, the protocol, and the UDP length. This information gives protection against misrouted datagrams. This checksum procedure is the same as is used in TCP.

                  0      7 8     15 16    23 24    31
                 +--------+--------+--------+--------+
                 |          source address           |
                 +--------+--------+--------+--------+
                 |        destination address        |
                 +--------+--------+--------+--------+
                 |  zero  |protocol|   UDP length    |
                 +--------+--------+--------+--------+

If the computed checksum is zero, it is transmitted as all ones (the equivalent in one's complement arithmetic). An all-zero transmitted checksum value means that the transmitter generated no checksum (for debugging or for higher level protocols that don't care).
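The checksum procedure above can be sketched in Python. This is an illustrative implementation of the one's complement sum over the pseudo header, UDP header, and data; the addresses and ports in the example are arbitrary, and the checksum field is zeroed before computing, as required:

```python
import struct

def ones_complement_sum(data: bytes) -> int:
    # Pad to an even number of octets, then add 16-bit words with
    # end-around carry (one's complement addition).
    if len(data) % 2:
        data += b"\x00"
    total = 0
    for (word,) in struct.iter_unpack("!H", data):
        total += word
        total = (total & 0xFFFF) + (total >> 16)
    return total

def udp_checksum(src_ip: bytes, dst_ip: bytes, udp_segment: bytes) -> int:
    # Pseudo header: source address, destination address, a zero octet,
    # the protocol number (17 for UDP), and the UDP length.
    pseudo = src_ip + dst_ip + struct.pack("!BBH", 0, 17, len(udp_segment))
    checksum = ~ones_complement_sum(pseudo + udp_segment) & 0xFFFF
    return checksum or 0xFFFF   # a computed zero is transmitted as all ones

# Segment with header fields (1234, 53, length 13, checksum 0) plus data.
seg = struct.pack("!HHHH", 1234, 53, 13, 0) + b"hello"
print(hex(udp_checksum(bytes([10, 0, 0, 1]), bytes([10, 0, 0, 2]), seg)))
```

A receiver verifies by summing the same material with the transmitted checksum included; a valid datagram yields an all-ones sum.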


AIRTEL 3G FOR PC

Airtel free 3G and 2G using the AirVPN client [tested]

Connect your mobile to Nokia PC Suite with the following setting:

APN: airtelgprs.com

For free GPRS you should download this software.




1. First of all, go to the site, register for free, and get your username and password.

2. Then log in at airvpn.org.

3. Go to the Members area, then click "Access without client".

4. There, select Free - UDP - 53.

5. Tick "I have read and I accept the Terms of Service" and "I HEREBY EXPLICITLY ACCEPT POINTS 8, 10, 11".

Now click Generate and save the file. The file will be saved as air.zip.

Extract all the contents of the zip file into the folder C:\Program Files\OpenVPN\config\

Now run the OpenVPN GUI. It will minimize to the system tray, next to the date at the right-hand corner of the taskbar. Right-click the OpenVPN icon and click Connect.

That's it: you can now run any app for free over the UDP connection, and no proxy details need to be filled in Firefox.

Note: this tutorial uses the OpenVPN client, but you can also try the AirVPN client and compare the results.

I used UDP on port 53 when generating the config (Free - UDP - 53 for Airtel). You may also use others, such as:




Free - UDP - 443

Free - TCP - 443

Free - UDP - 80

Free - TCP - 80

Or Free - TCP - 53

Speed is limited to 1 Mbps.


User Datagram Protocol      


UDP applications use datagram sockets to establish host-to-host communications. An application binds a socket to its endpoint of data transmission, which is a combination of an IP address and a service port. A port is a software structure identified by the port number, a 16-bit integer value, allowing for port numbers between 0 and 65535. Port 0 is reserved, but is a permissible source port value if the sending process does not expect messages in response.
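The endpoint described above, an (IP address, port) pair, is what `bind` establishes; binding to port 0 asks the operating system for any free ephemeral port. A small sketch:

```python
import socket

# A datagram socket becomes an endpoint once bound to (IP address, port).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 0))      # port 0: the OS assigns a free port
ip, port = sock.getsockname()
print(ip, port)                  # the chosen port varies from run to run
sock.close()
```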

Intel Graphics Media Accelerator


   The GMA line of GPUs replaces the earlier "Intel Extreme Graphics", and the Intel740 line, which were discrete units in the form of AGP and PCI cards. Later, Intel integrated the i740 core into the Intel 810 chipset.
The original architecture of GMA systems supported only a few functions in hardware, relying on the host CPU to handle at least some of the graphics pipeline, which further decreased performance. However, with the introduction of Intel's 4th generation of GMA architecture (GMA X3000) in 2006, many of these functions are built into the hardware, providing an increase in performance. The 4th generation of GMA combines fixed-function capabilities with a threaded array of programmable execution units, benefiting both graphics and video performance. Many of the advantages of the new GMA architecture come from the ability to flexibly switch as needed between executing graphics-related and video-related tasks. While GMA performance has been widely criticized in the past as being too slow for computer games, the latest GMA generation should ease many of those concerns for the casual gamer.
Despite similarities, Intel's main series of GMA IGPs is not based on the PowerVR technology Intel licensed from Imagination Technologies. Intel used the low-power PowerVR MBX designs in chipsets supporting their XScale platform, and since the sale of XScale in 2006 has licensed the PowerVR SGX and used it in the GMA 500 IGP for use with their Atom platform.
HD Graphics 
With the introduction of the Arrandale-based Core i3, Core i5, and Core i7 processors, graphics cores are now built into the processor package itself. The integrated graphics chips are built on a 45 nm process and are much more power efficient than previous-generation GMA cores. The graphics chips on the mobile Arrandale processors include a feature similar to Turbo Boost called dynamic frequency scaling, which allows the graphics core to briefly raise its clock for extra headroom.


Graphics cores
GMA 900
The GMA 900 was the first graphics core produced under Intel's Graphics Media Accelerator product name, and was incorporated in the Intel 910G, 915G, and 915Gx chipsets.
The 3D architecture of the GMA 900 was a significant upgrade from the previous Extreme 3D graphics processors. It is a 4 pixel per clock cycle design supporting DirectX 9 pixel shader model 2.0. It operates at a clock rate ranging from 160 to 333 MHz, depending on the particular chipset. At 333 MHz, it has a peak pixel fill-rate of 1332 megapixels per second. However, the architecture still lacks support for hardware transform and lighting and the similar vertex shader technologies.
Like previous Intel integrated graphics parts, the GMA 900 has hardware support for MPEG-2 motion compensation, color-space conversion and DirectDraw overlay.
The processor uses separate clock generators for the display and render cores. The display unit includes a 400 MHz RAMDAC and two 25–200 Mpixel/s serial DVO ports.


GMA 950
The GMA 950 is Intel's second-generation graphics core, also referred to by Intel as the 'Gen 3.5 Integrated Graphics Engine' in datasheets. It is used in the Intel 940GML, 945G, 945GU and 945GT system chipsets. The amount of video-decoding hardware has increased: VLD, iDCT, and dual video overlay windows are supposed to be handled in hardware. However, in a feature comparison document[3] it is noted that VLD and iDCT are not supported until the GMA 3100 (on G33 chipsets only). The maximum core clock is up to 400 MHz (on Intel 945G, 945GC, 945GZ, 945GSE), boosting the pixel fill-rate to a theoretical 1600 megapixels/s.
The GMA 950 shares the same architectural weakness as the GMA 900: no hardware geometry processing. Neither basic hardware transform and lighting, nor more advanced vertex shaders, are handled in the GMA hardware.
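The peak fill-rate figures quoted for the GMA 900 (1332 megapixels/s at 333 MHz) and GMA 950 (1600 megapixels/s at 400 MHz) follow directly from the 4-pixel-per-clock design times the core clock; a quick back-of-the-envelope check:

```python
# Peak pixel fill rate = pixels per clock x core clock (MHz -> megapixels/s).
def fill_rate_mpix(pixels_per_clock: int, clock_mhz: int) -> int:
    return pixels_per_clock * clock_mhz

print(fill_rate_mpix(4, 333))   # GMA 900 at 333 MHz: 1332 megapixels/s
print(fill_rate_mpix(4, 400))   # GMA 950 at 400 MHz: 1600 megapixels/s
```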


GMA 3000
The 946GZ, Q965, and Q963 chipsets use the GMA 3000 chip. The GMA 3000 3D core is very different from the X3000, despite their similar names. It is based more directly on the previous-generation GMA 900 and GMA 950 graphics, belonging to the same "i915" family. It has pixel and vertex shaders which only support Shader Model 2.0b features, and the vertex shaders are still only software-emulated. In addition, hardware video acceleration such as hardware-based iDCT computation, ProcAmp (video stream independent color correction), and VC-1 decoding is not implemented in hardware. Of the GMA 3000-equipped chipsets, only the Q965 retains dual independent display support. The core speed is rated at 400 MHz with a 1.6 Gpixel/s fill rate in datasheets, but was listed as a 667 MHz core in the white paper.
The memory controller can now address a maximum of 256 MB of system memory, and the integrated serial DVO ports have an increased top speed of 270 Mpixel/s.


GMA 3100
The G31, G33, Q33 and Q35 chipsets use the GMA 3100, which is DirectX 9 capable. The 3D core is very similar to the older GMA 3000, including the lack of hardware accelerated vertex shaders.


GMA X3000
The GMA X3000 for desktop was "substantially redesigned" compared to previous GMA iterations and is used in the Intel G965 north bridge controller. The GMA X3000 was launched in July 2006. The X3000's underlying 3D rendering hardware is organized as a unified shader processor consisting of 8 scalar execution units. Each pipeline can process video, vertex, or texture operations. A central scheduler dynamically dispatches threads to pipeline resources to maximize rendering throughput (and decrease the impact of individual pipeline stalls). However, due to the scalar nature of the execution units, they can only process data on a single pixel component at a time.[11] The GMA X3000 supports DirectX 9.0 with vertex and pixel Shader Model 3.0 features.
The processor consists of different clock domains, meaning that the entire chip does not operate at the same clock speed. This causes some difficulty when measuring peak throughput of its various functions. Further adding to the confusion, it is listed as 667 MHz in the Intel G965 white paper, but as 400 MHz in the Intel G965 datasheet. There are various rules that define the IGP's processing capabilities.
The memory controller can now address a maximum of 384 MB of memory according to the white paper, but only 256 MB according to the datasheet.


GMA X3100
The GMA X3100 is the mobile version of the GMA X3000, used in the Intel GL960/GM965 chipsets and also in the GS965 chipset. The X3100 supports hardware transform and lighting, up to 128 programmable shader units, and up to 384 MB of memory. Its display cores can run up to 333 MHz on GM965 and 320 MHz on GL960. Its render cores can run up to 500 MHz on GM965 and 400 MHz on GL960. The X3100 display unit includes a 300 MHz RAMDAC, two 25–112 MHz LVDS transmitters, 2 DVO encoders, and a TV encoder. In addition, the hardware supports DirectX 10.0, Shader Model 4.0, and OpenGL 1.5.


GMA X3500
The GMA X3500 is an upgrade of the GMA X3000 and is used in the desktop G35 chipset. Architecturally, the GMA X3500 is very similar to the GMA X3000, with both GMAs running at 667 MHz. The major difference is that the GMA X3500 supports Shader Model 4.0 and DirectX 10, whereas the earlier X3000 supports Shader Model 3.0 and DirectX 9. The X3500 also adds hardware assistance for playback of VC-1 video.


GMA X4500
The GMA X4500 and the GMA X4500HD for desktop were launched in June 2008. The GMA X4500 is used in the G43 chipset and the GMA X4500HD is used in the G45 chipset. The GMA X4500 is also used in the G41 chipset, which was released in September 2008.
The GMA 4500MHD for laptops was launched on July 16, 2008. Feature-wise, the 4500MHD is identical to its desktop cousin, the X4500HD. It had been previously rumored that a cost-reduced version, the GMA 4500, was to be launched in late 2008 or early 2009 for use in the upcoming Q43 and Q45 chipsets, but in practice the Q43 and Q45 chipsets also use the GMA X4500.
The difference between the GMA X4500 and the GMA X4500HD is that the GMA X4500HD is capable of "full 1080p high-definition video playback, including Blu-ray disc movies".
Like the X3500, X4500 supports DirectX 10 and Shader Model 4.0 features. Intel designed the GMA X4500 to be 200% faster than the GMA 3100 (G33 chipset) in 3DMark06 performance and 70% faster than the GMA X3500 (G35 chipset).


GMA 500
The Intel SCH (System Controller Hub; codenamed Poulsbo) for the Atom processor Z5xx series features a GMA 500 graphics system. Rather than being developed in-house, this core is a PowerVR SGX 535 core licensed from Imagination Technologies.[24] Intel describes this as "a flexible, programmable architecture that supports shader-based technology, 2D, 3D and advanced 3D graphics, high-definition video decode, and image processing. Features include screen tiling, internal true color processing, zero overhead anti-aliasing, programmable shader 3D accelerator, and 32-bit floating-point operations."

GAMING GRAPHICS


AMD's Radeon HD 6990 has no trouble establishing performance superiority. But does speed at any cost sacrifice too much of the user experience?


In drag racing, they say ‘a chase is a race.’ In other words, if you floor it and the guy next to you follows suit, that’s a race, and you’d better be prepared to pay up at the finish if it’s a money contest.


Both AMD and Nvidia have ridiculous dual-GPU hot rods they’ve been tweaking and tuning for months. Understandably, they want to stay secretive about their respective power plants. But neither one seems willing to mash the pedal and risk an embarrassing second-place finish. It’s a good thing that these two companies don’t live their lives a quarter-mile at a time. I can just see Vin, shaking his head in disappointment.


But come on already, guys! The AMD Radeon HD 6990 was supposed to be a 2010 model, and here we are in March wondering if AMD overpromised during its press briefing last October. We even heard rumors that the 6990 was canceled.


Au contraire, Pierre. It looks like AMD is making the first move with its blown Charger, daring Nvidia to throw down with a twin sequential turbocharged Supra...you probably know it as the rumored GeForce GTX 590. We received a single Radeon HD 6990 4 GB one week ago, beta drivers a couple of days later, and updated Catalyst Application Profiles a couple of days after that. Needless to say, the benchmarking marathon that went on in our Bakersfield, CA lab made the 24 Hours of Le Mans look like kart racing at an amusement park.






Meet Radeon HD 6990 4 GB


It just sounds majestic, doesn’t it? 6990. 4 GB. Unlike anything we’ve ever seen from AMD on the desktop. But don’t let naming trickery disarm you like the beautiful rosso corsa of Ferrari’s race cars.


The Radeon HD 6990 follows in the pedigree of Radeon HD 4870 X2 and Radeon HD 5970. It’s a dual-GPU card with graphics processors running, by default, at slightly reduced clock speeds compared to the company’s fastest single-chip board. Its 4 GB of memory are divided between both ASICs. So, you’re essentially looking at two 2 GB configurations on a single PCB, running in CrossFire.


Although it was previously referred to by the code name Antilles, Radeon HD 6990 centers on two of the Cayman-based GPUs found in Radeon HD 6970 and 6950 graphics cards. If you remember from Radeon HD 6970 And 6950 Review: Is Cayman A Gator Or A Crock?, Cayman employs a slightly modified architecture, designed to extract more performance per square millimeter of die space. There are situations where this VLIW4 architecture could underperform AMD's older VLIW5 design, but the company says those situations are rare.




Old VLIW5: Cypress


Bottom line: the highest-end Cayman configuration offers fewer ALUs than the most complex Cypress processor (found in the Radeon HD 5800-series cards). However, Cayman’s ALUs are more capable. For a deeper background on Cayman’s architecture, check the second page of our launch coverage.




New VLIW4: Cayman


Each Cayman GPU serves up 1536 ALUs spread across 24 SIMDs, with each SIMD tied to four texture units, totaling 96. Radeon HD 6990 utilizes Cayman in its uncut form, so you get 3072 ALUs and 192 texture units between the pair of GPUs. As mentioned, the 4 GB frame buffer is divided up, 2 GB of GDDR5 per processor, connected via a 256-bit bus.


AMD unifies the two Cayman GPUs using the exact same 48-lane PCI Express 2.0 switch from PLX found on the Radeon HD 5970. Sixteen of those lanes serve the slot interface, 16 go to GPU 1, and 16 go to GPU 2.




                             Radeon HD 6990    Radeon HD 6970   Radeon HD 6950   GeForce GTX 580
Manufacturing Process        40 nm TSMC        40 nm TSMC       40 nm TSMC       40 nm TSMC
Die Size                     2 x 389 mm²       389 mm²          389 mm²          520 mm²
Transistors                  2 x 2.64 billion  2.64 billion     2.64 billion     3 billion
Engine Clock                 830 MHz           880 MHz          800 MHz          772 MHz
Stream Processors/CUDA Cores 3072              1536             1408             512
Compute Performance          5.1 TFLOPS        2.7 TFLOPS       2.25 TFLOPS      1.58 TFLOPS
Texture Units                192               96               88               64
Texture Fillrate             159.4 Gtex/s      84.5 Gtex/s      70.4 Gtex/s      49.4 Gtex/s
ROPs                         64                32               32               48
Pixel Fillrate               53.1 Gpix/s       28.2 Gpix/s      25.6 Gpix/s      37.1 Gpix/s
Frame Buffer                 4 GB GDDR5        2 GB GDDR5       2 GB GDDR5       1.5 GB GDDR5
Memory Clock                 1250 MHz          1375 MHz         1250 MHz         1002 MHz
Memory Bandwidth             2 x 160 GB/s      176 GB/s         160 GB/s         192 GB/s
                             (256-bit)         (256-bit)        (256-bit)        (384-bit)
Maximum Board Power          375 W             250 W            200 W            244 W


Of course, we’re ecstatic that AMD is using fully functional 40 nm Cayman GPUs (the kind you’d find on a Radeon HD 6970). But that product is already rated for up to 250 W maximum board power. Keeping the 6990’s thermal output manageable meant turning the clocks down from 880 MHz (Radeon HD 6970) to 830 MHz (Radeon HD 6990). AMD also uses a lower memory clock (1250 MHz rather than 1375 MHz). The resulting compute power adds up to 5.1 TFLOPS of single-precision math, or 1.27 TFLOPS double-precision.
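The single-precision figures above follow from ALU count times engine clock times 2 FLOPs per cycle (one multiply-add per ALU); a quick back-of-the-envelope check:

```python
# Single-precision throughput = ALUs x clock x 2 FLOPs per cycle (multiply-add).
def tflops(alus: int, clock_mhz: int) -> float:
    return alus * clock_mhz * 2 / 1e6

print(round(tflops(3072, 830), 2))   # Radeon HD 6990: 5.1 TFLOPS
print(round(tflops(1536, 880), 2))   # Radeon HD 6970: 2.7 TFLOPS
```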

GAME HISTORY



Spacewar!, developed for the PDP-1 in 1961, is often credited as one of the earliest computer games. The game consisted of two player-controlled spaceships maneuvering around a central star, each attempting to destroy the other.

Although personal computers only became popular with the development of the microprocessor, computer gaming on mainframes and minicomputers has existed since at least the 1950s. OXO, an adaptation of tic-tac-toe for the EDSAC, debuted in 1952. Another pioneering computer game was developed in 1961, when MIT students Martin Graetz and Alan Kotok, with fellow student Steve Russell, developed Spacewar! on a PDP-1 computer used for statistical calculations.
The first generation of PC games were often text adventures or interactive fiction, in which the player communicated with the computer by entering commands through a keyboard. An early text adventure, Adventure, was developed for the PDP-10 by Will Crowther in 1976, and expanded by Don Woods in 1977. By the 1980s, personal computers had become powerful enough to run games like Adventure, but by this time, graphics were becoming an important factor in games. Later games combined textual commands with basic graphics, as seen in the SSI Gold Box games such as Pool of Radiance, or Bard's Tale.

By the mid-1970s, games were developed and distributed through hobbyist groups and gaming magazines, such as Creative Computing and later Computer Gaming World. These publications provided game code that could be typed into a computer and played, encouraging readers to submit their own software to competitions.
Microchess was one of the first games for microcomputers which was sold to the public. First sold in 1977, Microchess eventually sold over 50,000 copies on cassette tape.
Industry crash
Main article: Video game crash of 1983
As the video game market became flooded with poor-quality cartridge games created by numerous companies attempting to enter the market, and over-produced high-profile releases such as the Atari 2600 adaptations of E.T. and Pac-Man grossly underperformed, the popularity of personal computers for education rose dramatically. In 1983, consumer interest in console video games dwindled to historical lows as interest in computer games rose.
The effects of the crash were largely limited to the console market, as established companies such as Atari posted record losses over subsequent years. Conversely, the home computer market boomed, as sales of low-cost color computers such as the Commodore 64 rose to record highs and developers such as Electronic Arts benefited from increasing interest in the platform.
The console market experienced a resurgence in the United States with the release of the Nintendo Entertainment System. In Europe, computer gaming continued to boom for many years after.
New genres
Increasing adoption of the computer mouse, driven partially by the success of games such as the highly successful King's Quest series, and high resolution bitmap displays allowed the industry to include increasingly high-quality graphical interfaces in new releases. Meanwhile, the Commodore Amiga computer achieved great success in the market from its release in 1985, contributing to the rapid adoption of these new interface technologies.
Further improvements to game artwork were made possible with the introduction of the first sound cards, such as AdLib's Music Synthesizer Card, in 1987. These cards allowed IBM PC compatible computers to produce complex sounds using FM synthesis, where they had previously been limited to simple tones and beeps. However, the rise of the Creative Labs Sound Blaster card, which featured much higher sound quality due to the inclusion of a PCM channel and digital signal processor, led AdLib to file for bankruptcy in 1992.
The year before, id Software had produced one of the first first-person shooter games, Hovertank 3D, the company's first in a line of highly influential titles in the genre. Other companies made first-person shooters as well, such as Accolade's Day of the Viper from 1989. Id Software went on to develop Wolfenstein 3D in 1992, which helped popularize the genre, kick-starting what would become one of the highest-selling genres of modern times. The game was originally distributed through the shareware model, allowing players to try a limited part of the game for free but requiring payment to play the rest, and represented one of the first uses of texture-mapped graphics in a popular game, along with Ultima Underworld.
While the leading Sega and Nintendo console systems kept their CPU speeds at 3–7 MHz, the 486 PC processor ran much faster, allowing it to perform many more calculations per second. The 1993 release of Doom on the PC was a breakthrough in 3D graphics, and it was soon ported to various game consoles in a general shift toward greater realism. In the same time frame, games such as Myst took advantage of the new CD-ROM delivery format to include many more assets (sound, images, video) for a richer game experience.
Many early PC games included extras such as the peril-sensitive sunglasses that shipped with The Hitchhiker's Guide to the Galaxy. These extras gradually became less common, but many games were still sold in the traditional over-sized boxes that used to hold the extra "feelies". Today, such extras are usually found only in Special Edition versions of games.
Contemporary gaming
By 1996, the rise of Microsoft Windows and the success of 3D console titles such as Super Mario 64 sparked great interest in hardware-accelerated 3D graphics on the IBM PC compatible, and soon resulted in attempts to produce affordable solutions with the ATI Rage, Matrox Mystique and S3 ViRGE. Tomb Raider, released in 1996, was one of the first third-person shooter games and was praised for its revolutionary graphics. As 3D graphics libraries such as DirectX and OpenGL matured and knocked proprietary interfaces out of the market, these platforms gained greater acceptance in the market, particularly with their demonstrated benefits in games such as Unreal. However, major changes to the Microsoft Windows operating system, by then the market leader, made many older MS-DOS-based games unplayable on Windows NT and, later, Windows XP (without using an emulator such as DOSBox).
The faster graphics accelerators and improving CPU technology resulted in increasing levels of realism in computer games. During this time, the improvements introduced with products such as ATI's Radeon R300 and Nvidia's GeForce 6 series allowed developers to increase the complexity of modern game engines. PC gaming currently tends strongly toward improvements in 3D graphics.
Unlike the generally accepted push for improved graphical performance, the use of physics engines in computer games has been a matter of debate since the announcement and 2005 release of the Ageia PhysX PPU, ostensibly competing with middleware such as the Havok physics engine. Issues such as the difficulty of ensuring consistent experiences for all players, and the uncertain benefit of first-generation PhysX cards in games such as Tom Clancy's Ghost Recon Advanced Warfighter and City of Villains, prompted arguments over the value of such technology.
Similarly, many game publishers began to experiment with new forms of marketing. Chief among these alternative strategies is episodic gaming, an adaptation of the older concept of expansion packs, in which game content is provided in smaller quantities but for a proportionally lower price. Titles such as Half-Life 2: Episode One took advantage of the idea, with mixed results arising from concerns over the amount of content provided for the price.

PC game development

Main article: Game development
Game development, as with console games, is generally undertaken by one or more game developers using either standardized or proprietary tools. While games could previously be developed by very small groups of people, as in the early example of Wolfenstein 3D, many popular computer games today require large development teams and budgets running into the millions of dollars.

PC games are usually built around a central piece of software, known as a game engine,[23] that simplifies the development process and enables developers to easily port their projects between platforms. Unlike most consoles, which generally only run major engines such as Unreal Engine 3 and RenderWare due to restrictions on homebrew software, personal computers may run games developed using a larger range of software. As such, a number of alternatives to expensive engines have become available, including open source solutions such as Crystal Space, OGRE and DarkPlaces.
User-created modifications
Main article: Mod (computer gaming)
The multi-purpose nature of personal computers often allows users to modify the content of installed games with relative ease. Since console games are generally difficult to modify without a proprietary software development kit, and are often protected by legal and physical barriers against tampering and homebrew software,[24][25] it is generally easier to modify the personal computer version of games using common, easy-to-obtain software. Users can then distribute their customised version of the game (commonly known as a mod) by any means they choose.
The inclusion of map editors such as UnrealEd with the retail versions of many games, and others that have been made available online such as GtkRadiant, allow users to create modifications for games easily, using tools that are maintained by the games' original developers. In addition, companies such as id Software have released the source code to older game engines, enabling the creation of entirely new games and major changes to existing ones.
Modding has allowed much of the community to produce game elements that would not normally be provided by the developer of the game, expanding or modifying normal gameplay to varying degrees. Perhaps the most notable example is Counter-Strike, a mod for Half-Life that turns the original story-driven FPS into a round-based, tactical FPS.
Distribution

Physical distribution
Computer games are typically sold on standard storage media, such as compact discs, DVDs, and floppy disks. These were originally passed on to customers through mail order services,[28] although retail distribution has replaced mail order as the main distribution channel for video games due to higher sales. Cassette tapes and various formats of floppy disks were initially the staple storage media of the 1980s and early 1990s, but have fallen out of practical use as the increasing sophistication of computer games raised the overall size of their data and program files.
The introduction of complex graphics engines in recent times has resulted in additional storage requirements for modern games, and thus an increasing interest in CDs and DVDs as the next compact storage media for personal computer games. The rising popularity of DVD drives in modern PCs, and the larger capacity of the new media (a single-layer DVD can hold up to 4.7 gigabytes of data, more than five times as much as a single CD), have resulted in their adoption as a format for computer game distribution. To date, CD versions are still offered for most games, while some games offer both the CD and the DVD versions.
Shareware
Main articles: Shareware and Game demo
Shareware marketing, whereby a limited or demonstration version of the full game is released to prospective buyers without charge, has been used as a method of distributing computer games since the early years of the gaming industry and was seen in the early days of Tanarus as well as many others. Shareware games generally offer only a small part of the gameplay offered in the retail product, and may be distributed with gaming magazines, in retail stores or on developers' websites free of charge.
In the early 1990s, shareware distribution was common among fledgling game companies such as Apogee Software, Epic MegaGames and id Software, and it remains a popular distribution method among smaller game developers. However, shareware has largely fallen out of favor among established game companies in favor of traditional retail marketing, with notable exceptions such as Big Fish Games and PopCap Games continuing to use the model today.
Online delivery
With the increased popularity of the Internet, online distribution of game content has become more common. Retail services such as Direct2Drive and Download.com allow users to purchase and download large games that would otherwise only be distributed on physical media such as DVDs, as well as providing cheap distribution of shareware and demonstration games. Companies such as RealNetworks provide a service that allows other websites to use their game catalog and e-commerce backend to publish their own game download distribution sites. Other services allow a subscription-based distribution model, in which users pay a monthly fee to download and play as many games as they wish.
The Steam system, developed by Valve Corporation, provides an alternative to traditional online services. Instead of allowing the player to download a game and play it immediately, games are made available for "pre-load" in an encrypted form days or weeks before their actual release date; on the official release date, a relatively small component is made available to unlock the game. Steam also ensures that, once bought, a game remains accessible to a customer indefinitely, whereas traditional media such as floppy disks and CD-ROMs are susceptible to unrecoverable damage and misplacement. The user does, however, depend on the Steam servers being online to download their games, and according to the Steam terms of service Valve has no obligation to keep the servers running; if Valve Corporation shut down, so would the servers. Valve has stated that if the service were discontinued, games would no longer require authorization from the servers to run, but it is not obligated to do so.
PC game genres

See also: Video game genres
The real-time strategy genre, which accounts for more than a quarter of all PC games sold, has found very little success on video game consoles, with releases such as Starcraft 64 failing in the marketplace. Real-time strategy games tend to suffer from the design of console controllers, which do not allow fast, accurate movement.


PC gaming technology


An exploded view of a modern personal computer: display, motherboard, CPU (microprocessor), primary storage (RAM), expansion cards (graphics cards, etc.), power supply, optical disc drive, secondary storage (hard disk), keyboard and mouse.
Main article: Personal computer
Hardware
Modern computer games place great demands on the computer's hardware, often requiring a fast central processing unit (CPU) to function properly. CPU manufacturers historically relied mainly on increasing clock rates to improve the performance of their processors, but had begun to move steadily towards multi-core CPUs by 2005. These processors allow the computer to process multiple tasks, called threads, simultaneously, allowing the use of more complex graphics, artificial intelligence and in-game physics.
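As a rough illustration of this idea (a Python sketch, not taken from any particular game engine; the subsystem names are invented), a game can farm independent per-frame work out to worker threads, which the operating system may schedule on separate cores:

```python
import threading

# Results produced by hypothetical per-frame subsystems.
results = {}

def update_physics():
    # Stand-in for a physics step: integrate 1000 bodies over a 16 ms frame.
    results["physics"] = sum(i * 0.016 for i in range(1000))

def update_ai():
    # Stand-in for AI decision-making for ten agents.
    results["ai"] = [agent % 3 for agent in range(10)]

# Run both subsystems concurrently, then wait for both to finish
# before the frame would be drawn.
threads = [threading.Thread(target=update_physics),
           threading.Thread(target=update_ai)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # both subsystems have completed
```

The benefit comes only when the tasks are genuinely independent; work that shares mutable state must be synchronized, which is part of why multi-core support arrived in games gradually.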
Similarly, 3D games often rely on a powerful graphics processing unit (GPU), which accelerates the process of drawing complex scenes in real time. GPUs may be an integrated part of the computer's motherboard, the most common solution in laptops,[35] or come packaged on a discrete graphics card with a supply of dedicated video RAM, connected to the motherboard through an AGP or PCI Express port. It is also possible to use multiple GPUs in a single computer, using technologies such as Nvidia's Scalable Link Interface and ATI's CrossFire.
Sound cards are also available to provide improved audio in computer games. These cards provide 3D audio and audio enhancement that is generally not available with integrated alternatives, at the cost of marginally lower overall performance. The Creative Labs Sound Blaster line was for many years the de facto standard for sound cards, although its popularity dwindled as PC audio became a commodity on modern motherboards.
Physics processing units (PPUs), such as the Nvidia PhysX (formerly AGEIA PhysX) card, are also available to accelerate physics simulations in modern computer games. PPUs allow the computer to process more complex interactions among objects than is achievable using only the CPU, potentially allowing players a much greater degree of control over the world in games designed to use the card.
Virtually all personal computers use a keyboard and mouse for user input. Other common gaming peripherals are a headset for faster communication in online games, joysticks for flight simulators, steering wheels for driving games and gamepads for console-style games.
Software
Computer games also rely on third-party software such as an operating system (OS), device drivers and libraries to run. Today, the vast majority of computer games are designed to run on the Microsoft Windows family of operating systems. Whereas earlier games written for MS-DOS would include code to communicate directly with hardware, today application programming interfaces (APIs) provide an interface between the game and the OS, simplifying game design. Microsoft's DirectX is an API widely used by today's computer games to communicate with sound and graphics hardware; OpenGL, a cross-platform API for graphics rendering, is also used. The version of the graphics card's driver installed can often affect game performance and gameplay. It is not unusual for a game company to use a third-party game engine, or third-party libraries for a game's AI or physics.

Multiplayer

Local area network gaming

Before cost-effective broadband Internet access became available, multiplayer gaming was largely limited to local area networks (LANs), whose bandwidth was typically higher, and latency lower, than the dial-up services of the time. These advantages allowed more players to join any given computer game, and LAN gaming has persisted because of the higher latency of many Internet connections and the costs associated with broadband Internet.
LAN gaming typically requires two or more personal computers, a router and sufficient networking cables to connect every computer on the network. Additionally, each computer must have a network card in order to communicate with the other computers, and its own copy of the game. Optionally, the LAN may include an external connection to the Internet.
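Once the machines are connected, a common first step is for a joining computer to discover which machine is hosting the game. The following Python sketch illustrates one typical approach (the port number and message strings are invented; a real client would broadcast to the LAN with `SO_BROADCAST`, while loopback is used here so the sketch is self-contained):

```python
import socket

DISCOVERY_PORT = 47777  # hypothetical port; real games define their own

# "Host" computer: listen for discovery pings.
host = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
host.bind(("127.0.0.1", DISCOVERY_PORT))  # on a real LAN: bind to ""

# "Joining" computer: send a discovery ping. On a real LAN this would be
# a broadcast to 255.255.255.255 after enabling SO_BROADCAST.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"ANYONE_HOSTING?", ("127.0.0.1", DISCOVERY_PORT))

# The host replies to whoever asked, advertising where the game runs.
data, addr = host.recvfrom(1024)
if data == b"ANYONE_HOSTING?":
    host.sendto(b"GAME_AT_PORT:48000", addr)

reply, _ = client.recvfrom(1024)
print(reply.decode())

host.close()
client.close()
```

UDP suits this kind of discovery because no connection setup is needed and a lost ping can simply be resent.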

Online games
Main article: Online game

Online multiplayer games have achieved popularity largely as a result of increasing broadband adoption among consumers. Affordable high-bandwidth Internet connections allow large numbers of players to play together, and have thus found particular use in massively multiplayer online role-playing games, multiplayer games such as Tanarus, and persistent online games such as World War II Online.
Although it is possible to participate in online computer games using dial-up modems, broadband Internet connections are generally considered necessary to reduce the latency between players (commonly known as "lag"). Such connections require a broadband-compatible modem connected to the personal computer through a network interface card (generally integrated onto the computer's motherboard), optionally separated by a router. Online games also require a virtual environment, generally called a "game server"; these servers interconnect gamers, allowing real-time and often fast-paced action. To meet this need, game server providers (GSPs) have become increasingly popular over the last half decade. While not required for all gamers, these servers provide a fully customizable "home" (with additional modifications, settings, etc.), giving gamers the experience they desire. Today there are over 510,000 game servers hosted in North America alone.
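The "lag" players perceive is essentially the round-trip time between client and game server. A minimal way to measure it can be sketched as follows (a loopback echo server stands in for a real game server, so the measured value here is only the local overhead):

```python
import socket
import time

# Minimal echo "game server" on loopback.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))        # OS picks a free port
server_addr = server.getsockname()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

start = time.perf_counter()
client.sendto(b"ping", server_addr)  # client -> server
data, addr = server.recvfrom(64)
server.sendto(data, addr)            # server echoes back
echo, _ = client.recvfrom(64)
rtt_ms = (time.perf_counter() - start) * 1000.0

print(f"round-trip time: {rtt_ms:.2f} ms")

server.close()
client.close()
```

Over a real broadband connection this round trip is commonly tens of milliseconds; over dial-up it could be several times higher, which is why fast-paced games favour broadband.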


Emulation
Main article: Emulator

Emulation software, used to run software without the original hardware, is popular for its ability to play legacy video games without the console or operating system for which they were designed. Operating system emulators include DOSBox, a DOS emulator that allows games originally developed for that operating system, and thus not compatible with a modern-day OS, to be played. Console emulators such as NESticle and MAME are relatively commonplace, although the complexity of modern consoles such as the Xbox or PlayStation makes them far more difficult to emulate, even for the original manufacturers.

Most emulation software mimics a particular hardware architecture, often to an extremely high degree of accuracy. This is particularly the case with classic home computers such as the Commodore 64, whose software often depends on highly sophisticated low-level programming tricks invented by game programmers and the demoscene.

Controversy

Main article: Video game controversy
PC games have long been a source of controversy, particularly related to the violence that has become commonly associated with video gaming in general. The debate surrounds the influence of objectionable content on the social development of minors, with organisations such as the American Psychological Association concluding that video game violence increases children's aggression, a concern that prompted a further investigation by the Centers for Disease Control in September 2006. Industry groups have responded by noting the responsibility of parents in governing their children's activities, while attempts in the United States to control the sale of objectionable games have generally been found unconstitutional.

Video game addiction is another cultural aspect of gaming to draw criticism as it can have a negative influence on health and on social relations. The problem of addiction and its health risks seems to have grown with the rise of Massively Multiplayer Online Role Playing Games (MMORPGs).
Alongside the social and health problems associated with computer game addiction, similar worries have grown about the effect of computer games on education.