
FOR SALE:
A unique, animal-themed media holder for jewel cases
SHARK MEDIA STORAGE RACK (YOUR CHOICE OF GOLDEN YELLOW, HYACINTH BLUE, PINK, OR LIGHT GREEN)

DETAILS:
Please let us know which color you would like when ordering
We currently have: 3 yellow, 3 hyacinth blue, 4 pink, 2 light green 

Store your favorite media on this quirky, shark-shaped storage rack!
This all-plastic media rack holds 12 jewel cases (CD, CD-ROM/PC game, PlayStation 1, Dreamcast, CD-R, CD-RW, etc.) along the length of the shark's back; the "14CD" printed on the label is misleading. Upgrade your CD storage game with this unique shark-shaped CD case rack. Showcase your love for both sharks and vintage fighter planes while keeping your media collection organized and protected. Its compact size makes it ideal for desks, shelves, or entertainment centers, allowing you to save space without compromising style.

Whether you're a collector, gamer, or simply looking to add a touch of personality to your room, this CD rack is perfect for you! Don't miss out on this opportunity to own a truly unique piece that will undoubtedly spark conversations and admiration.

Holds up to 12 jewel cases, keeping your CDs and games neatly organized and easily accessible.

Choose from various vibrant colors, allowing you to customize the rack to match your unique style and room decor.

The shark design is reminiscent of World War II fighter-plane nose art, featuring a fierce shark face on the front for an extra touch of personality.

Made from durable materials, ensuring long-lasting use and protection for your media.

Ships flat, and the assembly is simple, allowing you to enjoy your new CD rack in no time.

Holds more than media!
Its intended purpose is to hold media cases, but this lovable shark can be used for more. Use it in the office or home for organizing mail. It's possible to store small, thin books on this rack as well. Think of the possibilities! Drying rack of some sort? If you love sharks, you'll find a use!

Dimensions:
Assembled: approximately 15 3/4" (L) x 10 3/8" (H) x 3 3/4" (W)

CONDITION:
New in package. Please see photos.

THANK YOU FOR LOOKING. QUESTIONS? JUST ASK.
*ALL PHOTOS AND TEXT ARE INTELLECTUAL PROPERTY OF SIDEWAYS STAIRS CO. ALL RIGHTS RESERVED.*

"Sharks are a group of elasmobranch fish characterized by a cartilaginous skeleton, five to seven gill slits on the sides of the head, and pectoral fins that are not fused to the head. Modern sharks are classified within the clade Selachimorpha (or Selachii) and are the sister group to the rays. However, the term "shark" has also been used for extinct members of the subclass Elasmobranchii outside the Selachimorpha, such as Cladoselache and Xenacanthus, as well as other Chondrichthyes such as the holocephalid eugenedontidans.

Under this broader definition, the earliest known sharks date back to more than 420 million years ago.[2] Acanthodians are often referred to as "spiny sharks"; though they are not part of Chondrichthyes proper, they are a paraphyletic assemblage leading to cartilaginous fish as a whole. Since then, sharks have diversified into over 500 species. They range in size from the small dwarf lanternshark (Etmopterus perryi), a deep sea species of only 17 centimetres (6.7 in) in length, to the whale shark (Rhincodon typus), the largest fish in the world, which reaches approximately 12 metres (40 ft) in length.[3] Sharks are found in all seas and are common to depths of 2,000 metres (6,600 ft). They generally do not live in freshwater although there are a few known exceptions, such as the bull shark and the river shark, which can be found in both seawater and freshwater.[4] Sharks have a covering of dermal denticles that protects their skin from damage and parasites in addition to improving their fluid dynamics. They have numerous sets of replaceable teeth.[5]

Well-known species such as the great white shark, tiger shark, blue shark, mako shark, thresher shark, and hammerhead shark are apex predators—organisms at the top of their underwater food chain. Many shark populations are threatened by human activities....

Until the 16th century,[6] sharks were known to mariners as "sea dogs".[7] This is still evident in several species termed "dogfish", as well as in the porbeagle.

The etymology of the word "shark" is uncertain; the most likely explanation is that the original sense of the word was "predator, one who preys on others", from the Dutch schurk, meaning "villain, scoundrel" (cf. card shark, loan shark, etc.), which was later applied to the fish due to its predatory behaviour.[8]

A now disproven theory is that it derives from the Yucatec Maya word xok (pronounced 'shok'), meaning "fish".[9] Evidence for this etymology came from the Oxford English Dictionary, which notes that shark first came into use after Sir John Hawkins' sailors exhibited one in London in 1569 and used "sharke" to refer to the large sharks of the Caribbean Sea. However, the Middle English Dictionary records an isolated occurrence of the word shark (referring to a sea fish) in a letter written by Thomas Beckington in 1442, which rules out a New World etymology....

 Evidence for the existence of sharks dates from the Ordovician period, 450–420 million years ago, before land vertebrates existed and before a variety of plants had colonized the continents.[2] Only scales have been recovered from the first sharks and not all paleontologists agree that these are from true sharks, suspecting that these scales are actually those of thelodont agnathans.[11] The oldest generally accepted shark scales are from about 420 million years ago, in the Silurian period.[11] The first sharks looked very different from modern sharks.[12] At this time the most common shark tooth is the cladodont, a style of thin tooth with three tines like a trident, apparently to help catch fish. The majority of modern sharks can be traced back to around 100 million years ago.[13] Most fossils are of teeth, often in large numbers. Partial skeletons and even complete fossilized remains have been discovered. Estimates suggest that sharks grow tens of thousands of teeth over a lifetime, which explains the abundant fossils. The teeth consist of easily fossilized calcium phosphate, an apatite. When a shark dies, the decomposing skeleton breaks up, scattering the apatite prisms. Preservation requires rapid burial in bottom sediments.

Among the most ancient and primitive sharks is Cladoselache, from about 370 million years ago,[12] which has been found within Paleozoic strata in Ohio, Kentucky, and Tennessee. At that point in Earth's history these rocks made up the soft bottom sediments of a large, shallow ocean, which stretched across much of North America. Cladoselache was only about 1 metre (3.3 ft) long with stiff triangular fins and slender jaws.[12] Its teeth had several pointed cusps, which wore down from use. From the small number of teeth found together, it is most likely that Cladoselache did not replace its teeth as regularly as modern sharks. Its caudal fins had a similar shape to those of the great white shark and the pelagic shortfin and longfin makos. The presence of whole fish arranged tail-first in their stomachs suggests that they were fast swimmers with great agility.

Most fossil sharks from about 300 to 150 million years ago can be assigned to one of two groups. The Xenacanthida was almost exclusive to freshwater environments.[14][15] By the time this group became extinct about 220 million years ago, they had spread worldwide. The other group, the hybodonts, appeared about 320 million years ago and lived mostly in the oceans, but also in freshwater. The results of a 2014 study of the gill structure of an unusually well preserved 325-million-year-old fossil suggested that sharks are not "living fossils", but rather have evolved more extensively than previously thought over the hundreds of millions of years they have been around.[16]

Modern sharks began to appear about 100 million years ago.[13] Fossil mackerel shark teeth date to the Early Cretaceous. One of the most recently evolved families is the hammerhead shark (family Sphyrnidae), which emerged in the Eocene.[17] The oldest white shark teeth date from 60 to 66 million years ago, around the time of the extinction of the dinosaurs. In early white shark evolution there are at least two lineages: one lineage is of white sharks with coarsely serrated teeth and it probably gave rise to the modern great white shark, and another lineage is of white sharks with finely serrated teeth. These sharks attained gigantic proportions and include the extinct megatoothed shark, C. megalodon. Like most extinct sharks, C. megalodon is also primarily known from its fossil teeth and vertebrae. This giant shark reached a total length (TL) of more than 16 metres (52 ft).[18][19] C. megalodon may have approached a maximum of 20.3 metres (67 ft) in total length and 103 metric tons (114 short tons) in mass.[20] Paleontological evidence suggests that this shark was an active predator of large cetaceans." (wikipedia.org)

"Sharks are a group of elasmobranch fish characterized by a cartilaginous skeleton, five to seven gill slits on the sides of the head, and pectoral fins that are not fused to the head. Modern sharks are classified within the clade Selachimorpha (or Selachii) and are the sister group to the Batoidea (rays and kin). Some sources extend the term "shark" as an informal category including extinct members of Chondrichthyes (cartilaginous fish) with a shark-like morphology, such as hybodonts. Shark-like chondrichthyans such as Cladoselache and Doliodus first appeared in the Devonian Period (419-359 Ma), though some fossilized chondrichthyan-like scales are as old as the Late Ordovician (458-444 Ma).[1] The oldest modern sharks (selachians) are known from the Early Jurassic, about 200 Ma.

Sharks range in size from the small dwarf lanternshark (Etmopterus perryi), a deep sea species that is only 17 centimetres (6.7 in) in length, to the whale shark (Rhincodon typus), the largest fish in the world, which reaches approximately 12 metres (40 ft) in length.[2] They are found in all seas and are common to depths up to 2,000 metres (6,600 ft). They generally do not live in freshwater, although there are a few known exceptions, such as the bull shark and the river shark, which can be found in both seawater and freshwater.[3] Sharks have a covering of dermal denticles that protects their skin from damage and parasites in addition to improving their fluid dynamics. They have numerous sets of replaceable teeth.[4]

Several species are apex predators, which are organisms that are at the top of their food chain. Select examples include the tiger shark, blue shark, great white shark, mako shark, thresher shark, and hammerhead shark.

Sharks are caught by humans for shark meat or shark fin soup. Many shark populations are threatened by human activities. Since 1970, shark populations have been reduced by 71%, mostly from overfishing.[5]
Etymology

Until the 16th century,[6] sharks were known to mariners as "sea dogs".[7] This is still evident in several species termed "dogfish", as well as in the porbeagle.

The etymology of the word shark is uncertain; the most likely explanation is that the original sense of the word was "predator, one who preys on others", from the Dutch schurk, meaning 'villain, scoundrel' (cf. card shark, loan shark, etc.), which was later applied to the fish due to its predatory behaviour.[8]

A now disproven theory is that it derives from the Yucatec Maya word xook (pronounced [ʃoːk]), meaning 'shark'.[9] Evidence for this etymology came from the Oxford English Dictionary, which notes that shark first came into use after Sir John Hawkins' sailors exhibited one in London in 1569 and used "sharke" to refer to the large sharks of the Caribbean Sea. However, the Middle English Dictionary records an isolated occurrence of the word shark (referring to a sea fish) in a letter written by Thomas Beckington in 1442, which rules out a New World etymology....
Anatomy
Main article: Shark anatomy
General anatomical features of sharks
Teeth
Main article: Shark tooth
The teeth of tiger sharks are oblique and serrated to saw through flesh

Shark teeth are embedded in the gums rather than directly affixed to the jaw, and are constantly replaced throughout life. Multiple rows of replacement teeth grow in a groove on the inside of the jaw and steadily move forward, much like a conveyor belt; some sharks lose 30,000 or more teeth in their lifetime. The rate of tooth replacement varies from once every 8 to 10 days to several months. In most species, teeth are replaced one at a time, as opposed to the simultaneous replacement of an entire row, which is observed in the cookiecutter shark.[25]

Tooth shape depends on the shark's diet: those that feed on mollusks and crustaceans have dense and flattened teeth used for crushing, those that feed on fish have needle-like teeth for gripping, and those that feed on larger prey such as mammals have pointed lower teeth for gripping and triangular upper teeth with serrated edges for cutting. The teeth of plankton-feeders such as the basking shark are small and non-functional.[26]
Skeleton

Shark skeletons are very different from those of bony fish and terrestrial vertebrates. Sharks and other cartilaginous fish (skates and rays) have skeletons made of cartilage and connective tissue. Cartilage is flexible and durable, yet is about half the normal density of bone. This reduces the skeleton's weight, saving energy.[27] Because sharks do not have rib cages, they can easily be crushed under their own weight on land.[28]
Jaw

The jaws of sharks, like those of rays and skates, are not attached to the cranium. The jaw's surface (in comparison to the shark's vertebrae and gill arches) needs extra support due to its heavy exposure to physical stress and its need for strength. It has a layer of tiny hexagonal plates called "tesserae", which are crystal blocks of calcium salts arranged as a mosaic.[29] This gives these areas much of the same strength found in the bony tissue found in other animals.

Generally sharks have only one layer of tesserae, but the jaws of large specimens, such as the bull shark, tiger shark, and the great white shark, have two to three layers or more, depending on body size. The jaws of a large great white shark may have up to five layers.[27] In the rostrum (snout), the cartilage can be spongy and flexible to absorb the power of impacts.
Fins

Fin skeletons are elongated and supported with soft and unsegmented rays named ceratotrichia, filaments of elastic protein resembling the horny keratin in hair and feathers.[30] Most sharks have eight fins. Because their fins do not allow movement in the tail-first direction, sharks can only drift away from objects directly in front of them.[28]
Dermal denticles
Further information: Fish scale § Placoid scales
The dermal denticles of a lemon shark, viewed through a scanning electron microscope

Unlike bony fish, sharks have a complex dermal corset made of flexible collagenous fibers and arranged as a helical network surrounding their body. This works as an outer skeleton, providing attachment for their swimming muscles and thus saving energy.[31] Their dermal teeth give them hydrodynamic advantages as they reduce turbulence when swimming.[32] Some species of shark have pigmented denticles that form complex patterns like spots (e.g. Zebra shark) and stripes (e.g. Tiger shark). These markings are important for camouflage and help sharks blend in with their environment, as well as making them difficult for prey to detect.[33] For some species, dermal patterning returns to healed denticles even after they have been removed by injury.[34]
Tails

Tails provide thrust, making speed and acceleration dependent on tail shape. Caudal fin shapes vary considerably between shark species, due to their evolution in separate environments. Sharks possess a heterocercal caudal fin in which the dorsal portion is usually noticeably larger than the ventral portion. This is because the shark's vertebral column extends into that dorsal portion, providing a greater surface area for muscle attachment. This allows more efficient locomotion among these negatively buoyant cartilaginous fish. By contrast, most bony fish possess a homocercal caudal fin.[35]

Tiger sharks have a large upper lobe, which allows for slow cruising and sudden bursts of speed. The tiger shark must be able to twist and turn in the water easily when hunting to support its varied diet, whereas the porbeagle shark, which hunts schooling fish such as mackerel and herring, has a large lower lobe to help it keep pace with its fast-swimming prey.[36] Other tail adaptations help sharks catch prey more directly, such as the thresher shark's use of its powerful, elongated upper lobe to stun fish and squid....
In culture
Shark-themed nose art, made popular by the Flying Tigers (pictured), is commonly seen on military aircraft.
In Hawaii

Sharks figure prominently in Hawaiian mythology. Stories tell of men with shark jaws on their back who could change between shark and human form. A common theme was that a shark-man would warn beach-goers of sharks in the waters. The beach-goers would laugh and ignore the warnings and get eaten by the shark-man who warned them. Hawaiian mythology also includes many shark gods. Among a fishing people, the most popular of all aumakua, or deified ancestor guardians, are shark aumakua. Kamaku describes in detail how to offer a corpse to become a shark. The body transforms gradually until the kahuna can point the awe-struck family to the markings on the shark's body that correspond to the clothing in which the beloved's body had been wrapped. Such a shark aumakua becomes the family pet, receiving food, and driving fish into the family net and warding off danger. Like all aumakua it had evil uses such as helping kill enemies. The ruling chiefs typically forbade such sorcery. Many Native Hawaiian families claim such an aumakua, who is known by name to the whole community.[109]

Kamohoali'i is the best known and revered of the shark gods; he was the older and favored brother of Pele,[110] and helped and journeyed with her to Hawaii. He was able to assume all human and fish forms. A summit cliff on the crater of Kilauea is one of his most sacred spots. At one point he had a heiau (temple or shrine) dedicated to him on every piece of land that jutted into the ocean on the island of Molokai. Kamohoali'i was an ancestral god, not a human who became a shark and banned the eating of humans after eating one herself.[111][112] In Fijian mythology, Dakuwaqa was a shark god who was the eater of lost souls.
In American Samoa

On the island of Tutuila in American Samoa (a U.S. territory), there is a location called Turtle and Shark (Laumei ma Malie) which is important in Samoan culture — the location is the site of a legend called O Le Tala I Le Laumei Ma Le Malie, in which two humans are said to have transformed into a turtle and a shark.[113][114][115] According to the U.S. National Park Service, "Villagers from nearby Vaitogi continue to reenact an important aspect of the legend at Turtle and Shark by performing a ritual song intended to summon the legendary animals to the ocean surface, and visitors are frequently amazed to see one or both of these creatures emerge from the sea in apparent response to this call."[113]
In popular culture
Main article: Sharks in popular culture

In contrast to the complex portrayals by Hawaiians and other Pacific Islanders, the European and Western view of sharks has historically been mostly one of fear and malevolence.[116] Sharks are commonly portrayed in popular culture as eating machines, notably in the Jaws novel and the film of the same name, along with its sequels.[117] Sharks are threats in other films such as Deep Blue Sea, The Reef, and others, although they are sometimes used for comedic effect, such as in Finding Nemo and the Austin Powers series. Sharks tend to be seen quite often in cartoons whenever a scene involves the ocean. Such examples include the Tom and Jerry cartoons, Jabberjaw, and other shows produced by Hanna-Barbera. They are also commonly used as a clichéd means of killing off a character that is held up by a rope or some similar object as the sharks swim right below them, or the character may be standing on a plank above shark-infested waters." (wikipedia.org)

"In culture
Shark-themed nose art, made popular by the Flying Tigers (pictured), is commonly seen on military aircraft.
In Hawaii

Sharks figure prominently in Hawaiian mythology. Stories tell of men with shark jaws on their back who could change between shark and human form. A common theme was that a shark-man would warn beach-goers of sharks in the waters. The beach-goers would laugh and ignore the warnings and get eaten by the shark-man who warned them. Hawaiian mythology also includes many shark gods. Among a fishing people, the most popular of all aumakua, or deified ancestor guardians, are shark aumakua. Kamaku describes in detail how to offer a corpse to become a shark. The body transforms gradually until the kahuna can point the awe-struck family to the markings on the shark's body that correspond to the clothing in which the beloved's body had been wrapped. Such a shark aumakua becomes the family pet, receiving food, and driving fish into the family net and warding off danger. Like all aumakua it had evil uses such as helping kill enemies. The ruling chiefs typically forbade such sorcery. Many Native Hawaiian families claim such an aumakua, who is known by name to the whole community.[109]

Kamohoali'i is the best known and revered of the shark gods, he was the older and favored brother of Pele,[110] and helped and journeyed with her to Hawaii. He was able to assume all human and fish forms. A summit cliff on the crater of Kilauea is one of his most sacred spots. At one point he had a heiau (temple or shrine) dedicated to him on every piece of land that jutted into the ocean on the island of Molokai. Kamohoali'i was an ancestral god, not a human who became a shark and banned the eating of humans after eating one herself.[111][112] In Fijian mythology, Dakuwaqa was a shark god who was the eater of lost souls.
In American Samoa

On the island of Tutuila in American Samoa (a U.S. territory), there is a location called Turtle and Shark (Laumei ma Malie) which is important in Samoan culture — the location is the site of a legend called O Le Tala I Le Laumei Ma Le Malie, in which two humans are said to have transformed into a turtle and a shark.[113][114][115] According to the U.S. National Park Service, "Villagers from nearby Vaitogi continue to reenact an important aspect of the legend at Turtle and Shark by performing a ritual song intended to summon the legendary animals to the ocean surface, and visitors are frequently amazed to see one or both of these creatures emerge from the sea in apparent response to this call."[113]
In popular culture
Main article: Sharks in popular culture

In contrast to the complex portrayals by Hawaiians and other Pacific Islanders, the European and Western view of sharks has historically been mostly of fear and malevolence.[116] Sharks are used in popular culture commonly as eating machines, notably in the Jaws novel and the film of the same name, along with its sequels.[117] Sharks are threats in other films such as Deep Blue Sea, The Reef, and others, although they are sometimes used for comedic effect such as in Finding Nemo and the Austin Powers series. Sharks tend to be seen quite often in cartoons whenever a scene involves the ocean. Such examples include the Tom and Jerry cartoons, Jabberjaw, and other shows produced by Hanna-Barbera. They also are used commonly as a clichéd means of killing off a character that is held up by a rope or some similar object as the sharks swim right below them, or the character may be standing on a plank above shark infested waters." (wikipedia.org)

"The compact disc (CD) is a digital optical disc data storage format that was co-developed by Philips and Sony to store and play digital audio recordings. In August 1982, the first compact disc was manufactured. It was then released in October 1982 in Japan and branded as Digital Audio Compact Disc. It was released on March 2, 1983 in North America and Europe.

The format was later adapted (as CD-ROM) for general-purpose data storage. Several other formats were further derived, including write-once audio and data storage (CD-R), rewritable media (CD-RW), Video CD (VCD), Super Video CD (SVCD), Photo CD, Picture CD, Compact Disc-Interactive (CD-i) and Enhanced Music CD.

Standard CDs have a diameter of 120 millimetres (4.7 in) and are designed to hold up to 74 minutes of uncompressed stereo digital audio or about 650 MiB of data. Capacity is routinely extended to 80 minutes and 700 MiB by arranging data more closely on the same-sized disc. The Mini CD has various diameters ranging from 60 to 80 millimetres (2.4 to 3.1 in); they are sometimes used for CD singles, storing up to 24 minutes of audio, or delivering device drivers.

At the time of the technology's introduction in 1982, a CD could store much more data than a personal computer hard disk drive, which would typically hold 10 MiB. By 2010, hard drives commonly offered as much storage space as a thousand CDs, while their prices had plummeted to commodity levels. In 2004, worldwide sales of audio CDs, CD-ROMs, and CD-Rs reached about 30 billion discs. By 2007, 200 billion CDs had been sold worldwide.[3]
Physical details
See also: Shaped compact disc
Diagram of CD layers

    A polycarbonate disc layer has the data encoded by using bumps.
    A shiny layer reflects the laser.
    A layer of lacquer protects the shiny layer.
    Artwork is screen printed on the top of the disc.
    A laser beam is reflected off the CD to a sensor, which converts it into electronic data.

A CD is made from 1.2-millimetre (0.047 in) thick polycarbonate plastic and weighs 14–33 grams.[4] From the center outward, components are: the center spindle hole (15 mm), the first-transition area (clamping ring), the clamping area (stacking ring), the second-transition area (mirror band), the program (data) area, and the rim. The inner program area occupies a radius from 25 to 58 mm.

A thin layer of aluminum or, more rarely, gold is applied to the surface, making it reflective. The metal is protected by a film of lacquer normally spin coated directly on the reflective layer. The label is printed on the lacquer layer, usually by screen printing or offset printing.
Pits and Lands of a compact disc under a microscope

CD data is represented as tiny indentations known as pits, encoded in a spiral track moulded into the top of the polycarbonate layer. The areas between pits are known as lands. Each pit is approximately 100 nm deep by 500 nm wide, and varies from 850 nm to 3.5 µm in length.[5] The distance between the tracks (the pitch) is 1.6 µm.[6][7][8]

When playing an audio CD, a motor within the CD player spins the disc to a scanning velocity of 1.2–1.4 m/s (constant linear velocity, CLV)—equivalent to approximately 500 RPM at the inside of the disc, and approximately 200 RPM at the outside edge. The track on the CD begins at the inside and spirals outward so a disc played from beginning to end slows its rotation rate during playback.
Comparison of various optical storage media

The program area is 86.05 cm² and the length of the recordable spiral is 86.05 cm² / 1.6 µm = 5.38 km. With a scanning speed of 1.2 m/s, the playing time is 74 minutes, or 650 MiB of data on a CD-ROM. A disc with data packed slightly more densely is tolerated by most players (though some old ones fail). Using a linear velocity of 1.2 m/s and a narrower track pitch of 1.5 µm increases the playing time to 80 minutes, and data capacity to 700 MiB.
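
As a rough cross-check of the figures quoted above, the spiral length and playing time can be recomputed from the program area, track pitch, and scanning velocity. The short Python sketch below is illustrative only and uses the nominal values given in this excerpt:

    # Rough cross-check of the CD capacity arithmetic quoted above.
    # Values are the nominal figures from the text; real discs vary.

    program_area_cm2 = 86.05          # program (data) area
    track_pitch_um   = 1.6            # distance between spiral turns
    scan_speed_m_s   = 1.2            # constant linear velocity (CLV)

    # Unrolling the spiral: area / pitch gives the total track length.
    track_length_m = (program_area_cm2 * 1e-4) / (track_pitch_um * 1e-6)
    print(f"spiral length ~ {track_length_m / 1000:.2f} km")   # ~5.38 km

    # At constant linear velocity, playing time = length / speed.
    play_time_min = track_length_m / scan_speed_m_s / 60
    print(f"playing time  ~ {play_time_min:.0f} minutes")      # ~74-75 minutes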
This is a photomicrograph of the pits at the inner edge of a CD-ROM; 2-second exposure under visible fluorescent light.
The pits in a CD are 500 nm wide, between 830 nm and 3,000 nm long and 150 nm deep.

A CD is read by focusing a 780 nm wavelength (near infrared) semiconductor laser through the bottom of the polycarbonate layer. The change in height between pits and lands results in a difference in the way the light is reflected. Because the pits are indented into the top layer of the disc and are read through the transparent polycarbonate base, the pits form bumps when read.[9] The laser hits the disc, casting a circle of light wider than the modulated spiral track reflecting partially from the lands and partially from the top of any bumps where they are present. As the laser passes over a pit (bump), its height means that the part of the light reflected from its peak is 1/2 wavelength out of phase with the light reflected from the land around it. This causes partial cancellation of the laser's reflection from the surface. By measuring the reflected intensity change with a photodiode, a modulated signal is read back from the disc.
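
A rough sanity check of the half-wavelength cancellation described above, sketched in Python; the 780 nm wavelength is from the text, while the refractive index of polycarbonate (about 1.55) is an assumed typical value, not something the excerpt states:

    # Illustrative check of the half-wavelength cancellation described above.
    # The laser wavelength (780 nm) is from the text; the refractive index of
    # polycarbonate (~1.55) is an assumed typical value, not from the excerpt.

    laser_wavelength_nm = 780.0
    n_polycarbonate     = 1.55                                    # assumed

    wavelength_in_disc = laser_wavelength_nm / n_polycarbonate    # ~503 nm
    quarter_wave_depth = wavelength_in_disc / 4                   # ~126 nm

    # Light reflected from a bump travels about 2 * depth less than light from
    # the surrounding land, i.e. roughly half a wavelength inside the disc,
    # so the two reflections partially cancel.
    print(f"wavelength inside polycarbonate ~ {wavelength_in_disc:.0f} nm")
    print(f"quarter-wave pit depth          ~ {quarter_wave_depth:.0f} nm")

The resulting depth is in the same range as the 100-150 nm pit depths quoted elsewhere in this excerpt.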

To accommodate the spiral pattern of data, the laser is placed on a mobile mechanism within the disc tray of any CD player. This mechanism typically takes the form of a sled that moves along a rail. The sled can be driven by a worm gear or linear motor. Where a worm gear is used, a second shorter-throw linear motor, in the form of a coil and magnet, makes fine position adjustments to track eccentricities in the disc at high speed. Some CD drives (particularly those manufactured by Philips during the 1980s and early 1990s) use a swing arm similar to that seen on a gramophone. This mechanism allows the laser to read information from the center to the edge of a disc without having to interrupt the spinning of the disc itself.
Philips CDM210 CD Drive

The pits and lands do not directly represent the 0s and 1s of binary data. Instead, non-return-to-zero, inverted encoding is used: a change from either pit to land or land to pit indicates a 1, while no change indicates a series of 0s. There must be at least two, and no more than ten, 0s between each 1, which is defined by the length of the pit. This, in turn, is decoded by reversing the eight-to-fourteen modulation used in mastering the disc, and then reversing the cross-interleaved Reed–Solomon coding, finally revealing the raw data stored on the disc. These encoding techniques (defined in the Red Book) were originally designed for CD Digital Audio, but they later became a standard for almost all CD formats (such as CD-ROM)....
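
The transition rule described above (a change of level reads as a 1, a run without change reads as 0s) can be illustrated with a toy Python sketch; this is only the level-to-bit step and does not attempt eight-to-fourteen demodulation or Reed-Solomon decoding:

    # Toy illustration of the transition rule described above: a change from
    # pit to land (or land to pit) reads as a 1, no change reads as a 0.
    # This is not EFM or CIRC decoding, just the level-to-bit step.

    def transitions_to_bits(levels):
        """levels: sequence of 0/1 samples of the physical track (pit vs land)."""
        bits = []
        for prev, curr in zip(levels, levels[1:]):
            bits.append(1 if curr != prev else 0)
        return bits

    # A land-pit-land pattern; each run of equal samples yields a run of 0s
    # bounded by 1s, mirroring the "at least two, no more than ten 0s" rule.
    track = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0]
    print(transitions_to_bits(track))   # [0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0]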
Disc shapes and diameters
Comparison of several forms of disk storage showing tracks (not to scale); green denotes start and red denotes end.
* Some CD-R(W) and DVD-R(W)/DVD+R(W) recorders operate in ZCLV, CAA or CAV modes.

The digital data on a CD begins at the center of the disc and proceeds toward the edge, which allows adaptation to the different sizes available. Standard CDs are available in two sizes. By far, the most common is 120 millimetres (4.7 in) in diameter, with a 74- or 80-minute audio capacity and a 650 or 700 MiB (737,280,000-byte) data capacity. Discs are 1.2 millimetres (0.047 in) thick, with a 15 millimetres (0.59 in) center hole. The size of the hole was chosen by Joop Sinjou and based on a Dutch 10-cent coin: a dubbeltje.[15] Philips/Sony patented the physical dimensions.[16]

The official Philips history says the capacity was specified by Sony executive Norio Ohga to be able to contain the entirety of Beethoven's Ninth Symphony on one disc.[17]
Kees Schouhamer Immink received a personal technical Emmy award for his contributions to the coding technologies of the Compact Disc, DVD, and Blu-ray disc.

This is a myth according to Kees Immink, as the EFM code format had not yet been decided in December 1979, when the 120 mm size was adopted. The adoption of EFM in June 1980 allowed 30 percent more playing time, which would have resulted in 97 minutes for the 120 mm diameter or 74 minutes for a disc as small as 100 millimetres (3.9 in). Instead, however, the information density was lowered by 30 percent to keep the playing time at 74 minutes.[18][19][20] The 120 mm diameter has been adopted by subsequent formats, including Super Audio CD, DVD, HD DVD, and Blu-ray Disc. The 80-millimetre (3.1 in) diameter discs ("Mini CDs") can hold up to 24 minutes of music or 210 MiB.
Physical size     Audio capacity     CD-ROM data capacity     Definition
120 mm     74–80 min     650–700 MB     Standard size
80 mm     21–24 min     185–210 MB     Mini-CD size
80×54 mm – 80×64 mm     ~6 min     10–65 MB     "Business card" size
Logical format
Audio CD
Main article: Compact Disc Digital Audio

The logical format of an audio CD (officially Compact Disc Digital Audio or CD-DA) is described in a document produced in 1980 by the format's joint creators, Sony and Philips.[21] The document is known colloquially as the Red Book CD-DA after the color of its cover. The format is a two-channel 16-bit PCM encoding at a 44.1 kHz sampling rate per channel. Four-channel sound was to be an allowable option within the Red Book format, but has never been implemented. Monaural audio has no existing standard on a Red Book CD; thus, the mono source material is usually presented as two identical channels in a standard Red Book stereo track (i.e., mirrored mono); an MP3 CD, however, can have audio file formats with mono sound.
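
As an illustration of what the Red Book parameters above imply for raw audio data, a quick back-of-the-envelope calculation in Python (a sketch only, using the figures quoted in this excerpt):

    # Back-of-the-envelope data rate for Red Book audio (CD-DA), using the
    # parameters quoted above: 2 channels, 16 bits per sample, 44.1 kHz.

    channels        = 2
    bits_per_sample = 16
    sample_rate_hz  = 44_100

    bytes_per_second = channels * (bits_per_sample // 8) * sample_rate_hz
    print(bytes_per_second)                       # 176,400 bytes/s

    # 74 minutes of audio at that rate:
    audio_bytes = bytes_per_second * 74 * 60
    print(f"{audio_bytes / 2**20:.0f} MiB")       # ~747 MiB of raw samples

The raw audio payload over 74 minutes (roughly 747 MiB) is larger than the 650 MiB CD-ROM figure because CD-ROM Mode 1 sectors reserve part of each sector for synchronization, headers, and additional error correction.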

CD-Text is an extension of the Red Book specification for an audio CD that allows for the storage of additional text information (e.g., album name, song name, artist) on a standards-compliant audio CD. The information is stored either in the lead-in area of the CD, where there are roughly five kilobytes of space available or in the subcode channels R to W on the disc, which can store about 31 megabytes.

Compact Disc + Graphics is a special audio compact disc that contains graphics data in addition to the audio data on the disc. The disc can be played on a regular audio CD player, but when played on a special CD+G player, it can output a graphics signal (typically, the CD+G player is hooked up to a television set or a computer monitor); these graphics are almost exclusively used to display lyrics on a television set for karaoke performers to sing along with. The CD+G format takes advantage of the channels R through W. These six bits store the graphics information.

CD + Extended Graphics (CD+EG, also known as CD+XG) is an improved variant of the Compact Disc + Graphics (CD+G) format. Like CD+G, CD+EG uses basic CD-ROM features to display text and video information in addition to the music being played. This extra data is stored in subcode channels R-W. Very few, if any, CD+EG discs have been published.
Super Audio CD
Main article: Super Audio CD

Super Audio CD (SACD) is a high-resolution, read-only optical audio disc format that was designed to provide higher-fidelity digital audio reproduction than the Red Book. Introduced in 1999, it was developed by Sony and Philips, the same companies that created the Red Book. SACD was in a format war with DVD-Audio, but neither has replaced audio CDs. The SACD standard is referred to as the Scarlet Book standard.

Titles in the SACD format can be issued as hybrid discs; these discs contain the SACD audio stream as well as a standard audio CD layer which is playable in standard CD players, thus making them backward compatible.
CD-MIDI

CD-MIDI is a format used to store music-performance data, which upon playback is performed by electronic instruments that synthesize the audio. Hence, unlike the original Red Book CD-DA, these recordings are not digitally sampled audio recordings. The CD-MIDI format is defined as an extension of the original Red Book.
CD-ROM
Main article: CD-ROM

For the first few years of its existence, the CD was a medium used purely for audio. In 1983, however, the Yellow Book CD-ROM standard was established by Sony and Philips (later standardized as ISO/IEC 10149 in 1988), which defined a non-volatile optical computer data storage medium using the same physical format as audio compact discs, readable by a computer with a CD-ROM drive.
Video CD
Main article: Video CD

Video CD (VCD, View CD, and Compact Disc digital video) is a standard digital format for storing video media on a CD. VCDs are playable in dedicated VCD players, most modern DVD-Video players, personal computers, and some video game consoles. The VCD standard was created in 1993 by Sony, Philips, Matsushita, and JVC and is referred to as the White Book standard.

Overall picture quality is intended to be comparable to VHS video. Poorly compressed VCD video can sometimes be of lower quality than VHS video, but VCD exhibits block artifacts rather than analog noise and does not deteriorate further with each use. The 352×240 (or SIF) resolution was chosen because it is half the vertical and half the horizontal resolution of NTSC video. 352×288 is similarly one-quarter of the PAL/SECAM resolution. This approximates the (overall) resolution of an analog VHS tape, which, although it has double the number of (vertical) scan lines, has a much lower horizontal resolution.
Super Video CD
Main article: Super Video CD

Super Video CD (Super Video Compact Disc or SVCD) is a format used for storing video media on standard compact discs. SVCD was intended as a successor to VCD and an alternative to DVD-Video and falls somewhere between both in terms of technical capability and picture quality.

SVCD has two-thirds the resolution of DVD, and over 2.7 times the resolution of VCD. One CD-R disc can hold up to 60 minutes of standard-quality SVCD-format video. While no specific limit on SVCD video length is mandated by the specification, one must lower the video bit rate, and therefore quality, to accommodate very long videos. It is usually difficult to fit much more than 100 minutes of video onto one SVCD without incurring a significant quality loss, and many hardware players are unable to play a video with an instantaneous bit rate lower than 300 to 600 kilobits per second.
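
A quick check of those ratios, assuming the common NTSC frame sizes of 480×480 for SVCD, 720×480 for DVD, and 352×240 for VCD (assumed typical values; the excerpt does not list them):

    # Rough check of the resolution comparison above, using common NTSC frame
    # sizes (assumed here, not stated in the excerpt):
    #   SVCD 480x480, DVD 720x480, VCD 352x240.

    svcd = 480 * 480
    dvd  = 720 * 480
    vcd  = 352 * 240

    print(f"SVCD / DVD pixels: {svcd / dvd:.2f}")   # ~0.67, i.e. two-thirds
    print(f"SVCD / VCD pixels: {svcd / vcd:.2f}")   # ~2.73, i.e. over 2.7x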
Photo CD
Main article: Photo CD

Photo CD is a system designed by Kodak for digitizing and storing photos on a CD. Launched in 1992, the discs were designed to hold nearly 100 high-quality images, scanned prints, and slides using special proprietary encoding. Photo CDs are defined in the Beige Book and conform to the CD-ROM XA and CD-i Bridge specifications as well. They are intended to play on CD-i players, Photo CD players, and any computer with suitable software (irrespective of operating system). The images can also be printed out on photographic paper with a special Kodak machine. This format is not to be confused with Kodak Picture CD, which is a consumer product in CD-ROM format.
CD-i
Main article: Philips CD-i

The Philips Green Book specifies a standard for interactive multimedia compact discs designed for CD-i players (1993). CD-i discs can contain audio tracks that can be played on regular CD players, but CD-i discs are not compatible with most CD-ROM drives and software. The CD-i Ready specification was later created to improve compatibility with audio CD players, and the CD-i Bridge specification was added to create CD-i-compatible discs that can be accessed by regular CD-ROM drives.
CD-i Ready
Main article: CD-i Ready

Philips defined a format similar to CD-i called CD-i Ready, which puts CD-i software and data into the pregap of track 1. This format was supposed to be more compatible with older audio CD players.
Enhanced Music CD (CD+)
Main article: Blue Book (CD standard)

Enhanced Music CD, also known as CD Extra or CD Plus, is a format that combines audio tracks and data tracks on the same disc by putting audio tracks in a first session and data in a second session. It was developed by Philips and Sony, and it is defined in the Blue Book.
VinylDisc
Main article: VinylDisc

VinylDisc is a hybrid of a standard audio CD and a vinyl record. The vinyl layer on the disc's label side can hold approximately three minutes of music....
Manufacture
Main article: Compact Disc manufacturing
Individual pits are visible on the micrometer scale.

In 1995, material costs were 30 cents for the jewel case and 10 to 15 cents for the CD. The wholesale cost of CDs was $0.75 to $1.15, while the typical retail price of a prerecorded music CD was $16.98.[22] On average, the store received 35 percent of the retail price, the record company 27 percent, the artist 16 percent, the manufacturer 13 percent, and the distributor 9 percent.[22] When 8-track cartridges, compact cassettes, and CDs were introduced, each was marketed at a higher price than the format it succeeded, even though the cost to produce the media was reduced. This was done because the perceived value increased. This continued from phonograph records to CDs, but was broken when Apple marketed MP3s for $0.99 and albums for $9.99. The incremental cost of producing an MP3, though, is negligible.[23]
Writable compact discs
Recordable CD
700 MiB CD-R next to a mechanical pencil for scale
Main article: CD-R

Recordable Compact Discs, CD-Rs, are injection-molded with a "blank" data spiral. A photosensitive dye is then applied, after which the discs are metalized and lacquer-coated. The write laser of the CD recorder changes the color of the dye to allow the read laser of a standard CD player to see the data, just as it would with a standard stamped disc. The resulting discs can be read by most CD-ROM drives and played in most audio CD players. CD-Rs follow the Orange Book standard.

CD-R recordings are designed to be permanent. Over time, the dye's physical characteristics may change, causing read errors and data loss, until the reading device cannot recover with error correction methods. Errors can be predicted using surface error scanning. The design life is from 20 to 100 years, depending on the quality of the discs, the quality of the writing drive, and storage conditions.[24] However, testing has demonstrated such degradation of some discs in as little as 18 months under normal storage conditions.[25][26] This failure is known as disc rot, for which there are several, mostly environmental, reasons.[27]

The recordable audio CD is designed to be used in a consumer audio CD recorder. These consumer audio CD recorders use SCMS (Serial Copy Management System), an early form of digital rights management (DRM), to conform to the AHRA (Audio Home Recording Act). The Recordable Audio CD is typically somewhat more expensive than CD-R due to lower production volume and a 3 percent AHRA royalty used to compensate the music industry for the making of a copy.[28]

High-capacity recordable CD is a higher-density recording format that can hold 20% more data than conventional discs.[29] The higher capacity is incompatible with some recorders and recording software.[30]
ReWritable CD
Main article: CD-RW

CD-RW is a re-recordable medium that uses a metallic alloy instead of a dye. The write laser, in this case, is used to heat and alter the properties (amorphous vs. crystalline) of the alloy, and hence change its reflectivity. A CD-RW does not have as great a difference in reflectivity as a pressed CD or a CD-R, and so many earlier CD audio players cannot read CD-RW discs, although most later CD audio players and stand-alone DVD players can. CD-RWs follow the Orange Book standard.

The ReWritable Audio CD is designed to be used in a consumer audio CD recorder, which will not (without modification) accept standard CD-RW discs. These consumer audio CD recorders use the Serial Copy Management System (SCMS), an early form of digital rights management (DRM), to conform to the United States' Audio Home Recording Act (AHRA). The ReWritable Audio CD is typically somewhat more expensive than CD-R due to (a) lower volume and (b) a 3 percent AHRA royalty used to compensate the music industry for the making of a copy." (wikipedia.org)

"A CD-ROM (/ˌsiːdiːˈrɒm/, compact disc read-only memory) is a type of read-only memory consisting of a pre-pressed optical compact disc that contains data. Computers can read—but not write or erase—CD-ROMs. Some CDs, called enhanced CDs, hold both computer data and audio with the latter capable of being played on a CD player, while data (such as software or digital video) is only usable on a computer (such as ISO 9660[2] format PC CD-ROMs).

During the 1990s and early 2000s, CD-ROMs were popularly used to distribute software and data for computers and fifth-generation video game consoles. DVDs began to replace them in these roles in the early 2000s.
History

The earliest theoretical work on optical disc storage was done by independent researchers in the United States, including David Paul Gregg (1958) and James Russell (1965–1975). In particular, Gregg's patents were used as the basis of the LaserDisc specification that was co-developed between MCA and Philips after MCA purchased Gregg's patents, as well as the company he founded, Gauss Electrophysics.[3] The LaserDisc was the immediate precursor to the CD, with the primary difference being that the LaserDisc encoded information through an analog process whereas the CD used digital encoding.

The physical dimensions reportedly drew on Dutch references: the outer diameter is the same as that of a Heineken beer mat, and the inner hole matches the outer diameter of a Dutch dime (ten cents of a guilder), reflecting the leading role that Philips, a world-famous Dutch company, played in the invention.

Key work to digitize the optical disc was performed by Toshi Doi and Kees Schouhamer Immink during 1979–1980, who worked on a taskforce for Sony and Philips.[4] The result was the Compact Disc Digital Audio (CD-DA), defined in 1980. The CD-ROM was later designed as an extension of the CD-DA, and adapted this format to hold any form of digital data, with an initial storage capacity of 553 MB.[5] Sony and Philips created the technical standard that defines the format of a CD-ROM in 1983,[6] in what came to be called the Yellow Book. The CD-ROM was announced in 1984[7] and introduced by Denon and Sony at the first Japanese COMDEX computer show in 1985.[8] In November 1985, several computer industry participants including Microsoft, Philips, Sony, Apple and Digital Equipment Corporation met to create a specification to define a file system format for CD-ROMs.[9] The resulting specification, called the High Sierra format, was published in May 1986.[9] It was eventually standardized, with a few changes, as the ISO 9660 standard in 1988. One of the first products to be made available to the public on CD-ROM was the Grolier Academic Encyclopedia, presented at the Microsoft CD-ROM Conference in March 1986.[9]

CD-ROMs began being used in home video game consoles starting with the PC Engine CD-ROM² (TurboGrafx-CD) in 1988, while CD-ROM drives had also become available for home computers by the end of the 1980s. In 1990, Data East demonstrated an arcade system board that supported CD-ROMs, similar to 1980s laserdisc video games but with digital data, allowing more flexibility than older laserdisc games.[10] By early 1990, about 300,000 CD-ROM drives were sold in Japan, while 125,000 CD-ROM discs were being produced monthly in the United States.[11] Some computers which were marketed in the 1990s were called "multimedia" computers because they incorporated a CD-ROM drive, which allowed for the delivery of several hundred megabytes of video, picture, and audio data.
CD-ROM discs
Media
A CD-ROM in the tray of a partially open CD-ROM drive.

CD-ROMs are identical in appearance to audio CDs, and data are stored and retrieved in a very similar manner (only differing from audio CDs in the standards used to store the data). Discs are made from a 1.2 mm thick disc of polycarbonate plastic, with a thin layer of aluminium to make a reflective surface. The most common size of CD-ROM is 120 mm in diameter, though the smaller Mini CD standard with an 80 mm diameter, as well as shaped compact discs in numerous non-standard sizes and molds (e.g., business card-sized media), also exist.

Data is stored on the disc as a series of microscopic indentations called "pits", with the non-indented spaces between them called "lands". A laser is shone onto the reflective surface of the disc to read the pattern of pits and lands. Because the depth of the pits is approximately one-quarter to one-sixth of the wavelength of the laser light used to read the disc, the reflected beam's phase is shifted in relation to the incoming beam, causing destructive interference and reducing the reflected beam's intensity. This is converted into binary data.
Standard

Several formats are used for data stored on compact discs, known as the Rainbow Books. The Yellow Book, created in 1983,[6][12] defines the specifications for CD-ROMs, standardized in 1988 as the ISO/IEC 10149[1] standard and in 1989 as the ECMA-130[13] standard. The CD-ROM standard builds on top of the original Red Book CD-DA standard for CD audio. Other standards, such as the White Book for Video CDs, further define formats based on the CD-ROM specifications. The Yellow Book itself is not freely available, but the standards with the corresponding content can be downloaded for free from ISO or ECMA.

There are several standards that define how to structure data files on a CD-ROM. ISO 9660 defines the standard file system for a CD-ROM. ISO 13490 is an improvement on this standard which adds support for non-sequential write-once and re-writeable discs such as CD-R and CD-RW, as well as multiple sessions. The ISO 13346 standard was designed to address most of the shortcomings of ISO 9660,[14] and a subset of it evolved into the UDF format, which was adopted for DVDs. A bootable CD specification, called El Torito, was issued in January 1995, to make a CD emulate a hard disk or floppy disk.
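
As a small, concrete illustration of the ISO 9660 layout mentioned above: the volume descriptors of an ISO 9660 image begin at logical sector 16 (with 2,048-byte logical sectors) and carry the "CD001" identifier. The Python sketch below reads the primary volume descriptor from a raw image file; the file name is a placeholder and only a couple of fields are shown:

    # Minimal peek at an ISO 9660 image: the primary volume descriptor (PVD)
    # lives at logical sector 16 (2,048-byte sectors) and starts with a type
    # byte followed by the identifier "CD001". "example.iso" is a placeholder.

    SECTOR_SIZE = 2048

    with open("example.iso", "rb") as f:
        f.seek(16 * SECTOR_SIZE)          # volume descriptors begin at sector 16
        pvd = f.read(SECTOR_SIZE)

    assert pvd[1:6] == b"CD001", "not an ISO 9660 volume descriptor"
    print("descriptor type:", pvd[0])             # 1 = primary volume descriptor
    print("volume id      :", pvd[40:72].decode("ascii").strip())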
Manufacture
Main article: Compact Disc manufacturing

Pre-pressed CD-ROMs are mass-produced by a process of stamping where a glass master disc is created and used to make "stampers", which are in turn used to manufacture multiple copies of the final disc with the pits already present. Recordable (CD-R) and rewritable (CD-RW) discs are manufactured by a different method, whereby the data are recorded on them by a laser changing the properties of a dye or phase transition material in a process that is often referred to as "burning".
CD-ROM format

Data stored on CD-ROMs follows the standard CD data encoding techniques described in the Red Book specification (originally defined for audio CD only). This includes cross-interleaved Reed–Solomon coding (CIRC), eight-to-fourteen modulation (EFM), and the use of pits and lands for coding the bits into the physical surface of the CD.

The structures used to group data on a CD-ROM are also derived from the Red Book. Like audio CDs (CD-DA), a CD-ROM sector contains 2,352 bytes of user data, composed of 98 frames, each consisting of 33 bytes (24 bytes for the user data, 8 bytes for error correction, and 1 byte for the subcode). Unlike audio CDs, the data stored in these sectors corresponds to any type of digital data, not audio samples encoded according to the audio CD specification. To structure, address and protect this data, the CD-ROM standard further defines two sector modes, Mode 1 and Mode 2, which describe two different layouts for the data inside a sector.[2] A track (a group of sectors) inside a CD-ROM only contains sectors in the same mode, but if multiple tracks are present in a CD-ROM, each track can have its sectors in a different mode from the rest of the tracks. They can also coexist with audio CD tracks, which is the case of mixed mode CDs." (wikipedia.org)
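
To make the sector arithmetic above concrete, the Python sketch below recomputes the 2,352-byte sector size from the frame layout quoted in this excerpt and derives the familiar 650 MiB / 74-minute figure; the 2,048-byte Mode 1 payload and the 75 sectors-per-second rate are standard CD values assumed here, since the excerpt does not state them:

    # Recomputing the CD-ROM sector figures quoted above.
    # Frames per sector and user bytes per frame are from the text; the Mode 1
    # payload (2,048 bytes) and the 75 sectors/second rate are standard CD
    # values assumed here for the capacity estimate.

    frames_per_sector    = 98
    user_bytes_per_frame = 24
    sector_bytes = frames_per_sector * user_bytes_per_frame
    print(sector_bytes)                      # 2352, as stated above

    mode1_payload      = 2048                # user data per Mode 1 sector
    sectors_per_second = 75
    minutes            = 74

    capacity = mode1_payload * sectors_per_second * minutes * 60
    print(f"{capacity / 2**20:.0f} MiB")     # ~650 MiB for a 74-minute disc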

"The Dreamcast[a] is a home video game console released by Sega on November 27, 1998, in Japan; September 9, 1999, in North America; and October 14, 1999, in Europe. It was the first sixth-generation video game console, preceding Sony's PlayStation 2, Nintendo's GameCube, and Microsoft's Xbox. The Dreamcast was Sega's final console; its 2001 discontinuation ended the company's eighteen years in the console market.

The Dreamcast was developed by an internal Sega team led by Hideki Sato. In contrast to the expensive hardware of the unsuccessful Saturn, the Dreamcast was designed to reduce costs with off-the-shelf components, including a Hitachi SH-4 CPU and an NEC PowerVR2 GPU. Sega used the GD-ROM media format to avoid the expenses of DVD-ROM technology and an optional, custom version of the Windows CE operating system to make porting PC games easy. The Dreamcast was the first console to include a built-in modular modem for internet access and online play.

The Dreamcast was released to a subdued reception in Japan, but had a successful US launch backed by a large marketing campaign. However, interest steadily declined as Sony built anticipation for the PlayStation 2. Dreamcast sales did not meet Sega's expectations after several price cuts, and the company suffered significant financial losses. After a change in leadership, Sega discontinued the Dreamcast on March 31, 2001, withdrew from the console business, and restructured itself as a third-party developer. 9.13 million Dreamcast units were sold worldwide. Its bestselling game, Sonic Adventure (1998)—the first 3D game in Sega's Sonic the Hedgehog franchise—sold 2.5 million copies.

Despite its short lifespan and limited third-party support, reviewers have celebrated the Dreamcast as one of the greatest consoles. It is considered ahead of its time for pioneering concepts such as online play and downloadable content. Many Dreamcast games are regarded as innovative, including Sonic Adventure, Crazy Taxi (1999), Shenmue (1999), Jet Set Radio (2000), Phantasy Star Online (2000), and high-quality ports from Sega's NAOMI arcade system board.
History
Background

In 1988, Sega released the Genesis (known as the Mega Drive in most countries outside North America), in the fourth generation of video game consoles.[1] It became the most successful Sega console ever, at 30.75 million units sold.[2] Its successor, the Saturn, was released in Japan in 1994.[3] The Saturn is CD-ROM-based and has 2D and 3D graphics, but its complex dual-CPU architecture was more difficult to program than its chief competitor, the Sony PlayStation.[4] Although the Saturn debuted before the PlayStation in Japan and the United States,[5][6] its surprise US launch, four months earlier than scheduled,[7][8][9] was marred by a lack of distribution, which remained a problem.[10] Losses on the Saturn[11] contributed to financial problems for Sega, whose revenue had declined between 1992 and 1995 as part of an industry-wide slowdown.[5][12][13]

Sega announced that Shoichiro Irimajiri would replace Tom Kalinske as chairman and CEO of Sega of America,[14][15][16] while Bernie Stolar, a former executive at Sony Computer Entertainment of America,[17][18] became Sega of America's executive vice president in charge of product development and third-party relations.[15][16] After the 1996 launch of the Nintendo 64, sales of the Saturn and its software fell sharply. As of August 1997, Sony controlled 47 percent of the console market, Nintendo controlled 40 percent, and Sega controlled only 12 percent; neither price cuts nor high-profile games helped the Saturn.[18]

    I thought the Saturn was a mistake as far as hardware was concerned. The games were obviously terrific, but the hardware just wasn't there.

—Bernie Stolar, former president of Sega of America, in 2009[19]

As a result of Sega's deteriorating financial situation, Hayao Nakayama resigned as president of Sega in January 1998 in favor of Irimajiri,[20] and Stolar acceded to become CEO and president of Sega of America.[18][21] Following five years of generally declining profits,[22] in the fiscal year ending March 31, 1998, Sega suffered its first parent and consolidated financial losses since its 1988 listing on the Tokyo Stock Exchange,[23] reporting a consolidated net loss of ¥35.6 billion (US$269.8 million).[22] Shortly before announcing its financial losses, Sega announced the discontinuation of the Saturn in North America to prepare for the launch of its successor.[18][20] This effectively left the Western market without Sega games for more than a year.[4] Rumors about the upcoming Dreamcast—spread mainly by Sega—leaked to the public before the last Saturn games were released.[24]
Development

As early as 1995, reports surfaced that Sega would collaborate with Lockheed Martin, The 3DO Company, Matsushita, or Alliance Semiconductor to create a new graphics processing unit, which conflicting accounts said would be used for a 64-bit "Saturn 2" or an add-on peripheral.[25][26][27] Dreamcast development was unrelated.[26][28] Considering the Saturn's poor performance, Irimajiri looked beyond Sega's internal hardware development division to create a new console.[28] In 1997, he enlisted IBM's Tatsuo Yamamoto to lead an 11-person team to work on a secret project in the United States with the codename Blackbelt. Accounts vary on how an internal team led by Hideki Sato also began development on Dreamcast hardware; one account specifies that Sega tasked both teams,[29] and another suggests that Sato was bothered by Irimajiri's choice to begin development externally and had his team start work.[28][30] Sato and his group chose the Hitachi SH-4 processor architecture and the VideoLogic PowerVR2 graphics processor, manufactured by NEC, in the production of the mainboard. Initially known as Whitebelt,[28] the project was later codenamed Dural, after the metallic female fighter from Sega's Virtua Fighter series.[24][29]

Yamamoto's group opted to use 3dfx Voodoo 2 and Voodoo Banshee graphics processors alongside a Motorola PowerPC 603e central processing unit (CPU),[28] but Sega management later asked them to also use the SH-4 chip.[29] Both processors have been described as "off-the-shelf" components.[28] According to Charles Bellfield, the former Sega of America vice president of communications and former NEC brand manager, presentations of games using the NEC solution showcased the performance and low cost delivered by the SH-4 and PowerVR architecture. He said that Sega's relationship with NEC, a Japanese company, likely also influenced the decision to use its hardware rather than the architecture developed in America.[29] Stolar felt the US 3dfx version should have been used, but that "Japan wanted the Japanese version, and Japan won".[29] As a result, 3dfx filed a lawsuit against Sega and NEC claiming breach of contract, which was settled out of court.[28]

The choice to use the PowerVR architecture concerned Electronic Arts (EA), a longtime developer for Sega consoles. EA had invested in 3dfx but was unfamiliar with the selected architecture, which was reportedly less powerful.[29] According to Shiro Hagiwara (a general manager at Sega's hardware division) and Ian Oliver (the managing director of the Sega subsidiary Cross Products), the SH-4 was chosen while still in development, following lengthy deliberation, as the only processor that "could adapt to deliver the 3D geometry calculation performance necessary".[31] By February 1998, Sega had renamed the project Katana, after the Japanese sword,[24] although certain hardware specifications such as random access memory (RAM) were not finalized.[32]

Knowing the Saturn had been set back by its high production costs and complex hardware, Sega took a different approach with the Dreamcast. Like previous Sega consoles, the Dreamcast was designed around intelligent subsystems working in parallel,[31] but the selections of hardware were closer to personal computers than video game consoles, reducing cost.[28] It also enabled software development to begin before any development kits had been completed, as Sega informed developers that any game developed with a Pentium II 200 in mind would run on the console.[33] According to Damien McFerran, "the motherboard was a masterpiece of clean, uncluttered design and compatibility".[28]

The Chinese economist and future Sega.com CEO Brad Huang convinced the Sega chairman, Isao Okawa, to include a modem with every Dreamcast despite opposition from Okawa's staff over the additional US$15 cost per unit.[34][35][36] To account for rapid changes in home data delivery, Sega designed the modem to be modular.[31]

Sega selected the GD-ROM media format.[37] Jointly developed by Sega and Yamaha, the GD-ROM could be mass-produced at a similar price to a normal CD-ROM,[31] avoiding the greater expense of newer DVD-ROM technology.[28][38][39]

Microsoft developed a custom Dreamcast version of Windows CE with the DirectX API and dynamic-link libraries, making it easy to port PC games to the platform,[31] although programmers would ultimately favor Sega's development tools over those from Microsoft.[28] A member of the Project Katana team speaking anonymously predicted this would be the case, speculating that developers would prefer the greater performance possibilities offered by the Sega OS to the more user-friendly interface of the Microsoft OS.[32] In late 1997, reports emerged that the rumored system, then codenamed Dural, had been demonstrated to a number of game developers.[40]

The Dreamcast was finally revealed on May 21, 1998, in Tokyo.[41] Sega held a public competition to name its new system and considered over 5,000 different entries before choosing "Dreamcast"—a portmanteau of "dream" and "broadcast".[28] According to Katsutoshi Eguchi, the Japanese game developer Kenji Eno submitted the name and created the Dreamcast's spiral logo, but has not been officially credited by Sega.[42] The Dreamcast's startup sound was composed by the Japanese musician Ryuichi Sakamoto.[43] Because the Saturn had tarnished its reputation, Sega planned to remove its name from the console and establish a new gaming brand similar to Sony's PlayStation, but Irimajiri's management team decided to retain it.[28] Sega spent US$50–80 million on hardware development, $150–200 million on software development, and US$300 million on worldwide promotion—a sum which Irimajiri, a former Honda executive, humorously likened to the investments required to design new automobiles.[28][44]
Launch
Japan

Despite a 75 percent drop in half-year profits just before the Japanese launch, Sega was confident about the Dreamcast. It drew significant interest and many pre-orders.[28] However, Sega could not achieve its shipping goals for the Japanese Dreamcast launch due to a shortage of PowerVR chipsets caused by a high failure rate in the manufacturing process.[28][45] As more than half of its limited stock had been pre-ordered, Sega stopped pre-orders in Japan. On November 27, 1998, the Dreamcast launched in Japan at a price of ¥29,000, and the stock sold out by the end of the day. However, of the four games available at launch, only one—a port of Virtua Fighter 3, the most successful arcade game Sega ever released in Japan—sold well.[46] Sega estimated that an additional 200,000–300,000 Dreamcast units could have been sold with sufficient supply.[46]

Sega had announced that Sonic Adventure, the next game starring its mascot, Sonic the Hedgehog, would launch with the Dreamcast and promoted it with a large-scale public demonstration at the Tokyo Kokusai Forum Hall,[47][48][49] but it and Sega Rally Championship 2 were delayed.[28] They arrived within the following weeks, but sales continued to be slower than expected.[50] Irimajiri hoped to sell over 1 million Dreamcast units in Japan by February 1999, but sold fewer than 900,000, undermining Sega's attempts to build an installed base sufficient to protect the Dreamcast after the arrival of competition from other manufacturers.[51] There were reports of disappointed Japanese consumers returning their Dreamcasts and using the refund to purchase additional PlayStation software.[52] Seaman, released in July 1999, became the Dreamcast's first major hit in Japan.[4][34][53] Prior to the Western launch, Sega reduced the price of the Dreamcast to ¥19,900, effectively making it unprofitable but increasing sales. The reduction and the release of Namco's Soulcalibur helped Sega's shares gain 17 percent.[28]
North America

Before the Dreamcast's release, Sega was dealt a blow when EA, the largest third-party video game publisher, announced it would not develop games for it. EA's chief creative officer Bing Gordon said that Sega had "flip-flopped" on the hardware configuration, that EA developers did not want to work on it, and that Sega "was not acting like a competent hardware company". Gordon also said that Sega could not afford to give them the "kind of license that EA has had over the last five years".[29] According to Stolar, the EA president, Larry Probst, wanted exclusive rights as the only sports brand on Dreamcast, which Stolar could not accept due to Sega's recent US$10 million purchase of the sports game developer Visual Concepts. While EA's Madden NFL series had established brand power, Stolar regarded Visual Concepts' NFL 2K as superior and believed it would provide "a breakthrough experience" to launch the Dreamcast.[19][29] While the Dreamcast would have none of EA's popular sports games, "Sega Sports" games developed mainly by Visual Concepts[54] helped to fill that void.[29]

    Let's take the conservative estimate of 250,000 Dreamcast units at presage—that's a quarter of a million units at $200. We'll have a ratio of 1.5 or two games for every Dreamcast unit sold. That's half a million units of software. We think we'll be .5 to one on VMUs and peripheral items such as extra controllers and what have you. This could be a $60 to 80 million 24-hour period. What has ever sold $60 to 80 million in the first 24 hours?

—Peter Moore, speaking to Electronic Gaming Monthly about the upcoming launch of the Dreamcast.[55]
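
Moore's projection follows from simple back-of-envelope arithmetic. The sketch below reproduces it using the unit count, console price, and attach ratios from the quote; the average game and peripheral prices are illustrative assumptions that do not appear in the source, so the result is only a rough check that the stated figures land inside the quoted $60 to 80 million range.

    # Rough reconstruction of Moore's launch-day revenue arithmetic (illustrative only).
    # The software and peripheral prices below are assumptions, not figures from the quote.
    units = 250_000                  # presold Dreamcast consoles cited by Moore
    console_price = 200              # US$ per console
    game_attach = (1.5, 2.0)         # games per console, per the quote
    peripheral_attach = (0.5, 1.0)   # VMUs and other peripherals per console, per the quote
    game_price = 40                  # assumed average US$ per game
    peripheral_price = 25            # assumed average US$ per peripheral

    hardware = units * console_price  # $50 million in consoles alone
    low = hardware + units * (game_attach[0] * game_price + peripheral_attach[0] * peripheral_price)
    high = hardware + units * (game_attach[1] * game_price + peripheral_attach[1] * peripheral_price)
    print(f"${low / 1e6:.0f} to {high / 1e6:.0f} million")  # prints about $68 to $76 million

With these assumed prices the total falls in the middle of Moore's quoted range; other reasonable price assumptions shift it somewhat, but hardware alone accounts for $50 million of the figure.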

Working closely with Midway Games (which developed four launch games for the system) and taking advantage of the ten months following the Dreamcast's release in Japan, Sega of America worked to ensure a more successful US launch with a minimum of 15 launch games.[56] With lingering bitterness over the Saturn's early release, Stolar repaired relations with major US retailers, with whom Sega presold 300,000 Dreamcast units.[29] In addition, a pre-launch promotion enabled consumers to rent Dreamcasts from Hollywood Video in the months preceding its September launch.[57] Sega of America's senior vice president of marketing Peter Moore,[58] a fan of the attitude previously associated with Sega's brand, worked with Foote, Cone & Belding and Access Communications to develop the "It's Thinking" campaign of 15-second television commercials, which emphasized the Dreamcast's hardware power.[29][57][59] According to Moore, "We needed to create something that would really intrigue consumers, somewhat apologize for the past, but invoke [sic] all the things we loved about Sega, primarily from the Genesis days."[29] On August 11, Sega of America confirmed that Stolar had been fired, leaving Moore to direct the launch.[56][60]

The Dreamcast launched in North America on September 9, 1999, at a price of $199, which Sega's marketing dubbed "9/9/99 for $199".[4][51][57] Eighteen launch games were available in the US.[57][61][62] Sega set a new sales record by selling more than 225,132 Dreamcast units in 24 hours, earning $98.4 million in what Moore called "the biggest 24 hours in entertainment retail history".[29] Within two weeks, US Dreamcast sales exceeded 500,000.[29] By Christmas, Sega held 31 percent of the North American video game market share.[63] Significant launch games included Sonic Adventure, the arcade fighting game Soulcalibur, and Visual Concepts' football simulation NFL 2K.[29][58] On November 4, Sega announced it had sold over one million Dreamcast units.[64] The launch was marred by a glitch at one of Sega's manufacturing plants, which produced defective GD-ROMs.[65]
Europe

Sega released the Dreamcast in Europe on October 14, 1999,[64] at a price of £200.[28] By November 24, 400,000 consoles had been sold in Europe.[64] By Christmas of 1999, Sega of Europe had sold 500,000 units, six months ahead of schedule.[28] The price was dropped to £149.99 on September 8, 2000, by which point around 800,000 units had been sold in Europe.[66] Announcing the drop, Jean-François Cecillon, CEO of Sega Europe, commented that "There are 'X' amount of core gamers in Europe; the early adopters. We have reached 80 or 90 per cent of them now and the market is screaming for a price reduction. We have to acknowledge these things and go with the market".[67] Sales did not continue at this pace, and by October 2000, Sega had sold only about 1 million units in Europe.[68] As part of its promotion of the Dreamcast in Europe, Sega sponsored four European football clubs: Arsenal F.C. (England),[69] AS Saint-Étienne (France),[70] U.C. Sampdoria (Italy),[71] and Deportivo de La Coruña (Spain).[66]
Australia and New Zealand

Through the regional distributor Ozisoft, the Dreamcast went on sale in Australia and New Zealand on November 30, 1999, at a price of A$499.[72] The launch was planned for September, but was delayed due to problems with Internet compatibility and launch game availability, then delayed again from the revised date of October 25 for various reasons.[73][74][b] There were severe problems at launch; besides a severe shortage of the consoles, only six of the thirty planned launch games were available for purchase on day one with no first-party software included, and additional peripherals were not available in stores.[77]

The Ozisoft representative Steve O'Leary, in a statement released on the day of launch, explained that the Australian Customs Service had impounded virtually all the supplied launch software, including demo discs, due to insufficient labeling of their country of origin; Ozisoft had received the software only two days before launch, leaving few games catalogued and prepared for shipment in time. O'Leary also said that the Dreamcast's high demand in other markets had reduced the number of peripherals allotted to the region.[78] Further complicating matters were the lack of an internet disc, due to localization problems, and delays in securing an ISP contract, which was finalized with Telstra only the day before launch. The online component was not ready until March 2000, at which point Ozisoft sent the necessary software to users who had returned a filled-out reply-paid card included with the console.[79][80][81] The poor launch, combined with a lack of advertising and a high price point, produced lackluster sales in Australia; two large retail chains reported a combined total of 13 console sales over the first few days after launch....
Software
Game library
See also: List of Dreamcast games

The Dreamcast library consists of over 600 games across all regions,[165] in GD-ROM format.[37] It uses regional lockout, only playing games released within its predetermined region; however, this is circumventable via modchip installation, boot discs, or cheat discs such as Datel's Action Replay.[166][167][168] In Japan, the Dreamcast was launched with Virtua Fighter 3tb, Pen Pen TriIcelon, Godzilla Generations, and July.[169] In North America, it launched with 19 games, including the highly anticipated Sonic Adventure, Soulcalibur, and NFL 2K.[c][170] In Europe, it was planned to launch with 10 games; this increased to 15 after the launch was delayed.[d][173] Licensed Dreamcast games were released until mid-2002 in the US.[19] Some indie developers continued to release games, such as 2007's Last Hope, developed by the German studio NG:Dev.Team.[132]
First-party games
Sonic Adventure is a significant Dreamcast game, as the first 3D platforming Sonic game.

In what has been called "a brief moment of remarkable creativity",[4] in 2000, Sega restructured its arcade and console development teams into nine semi-autonomous studios headed by their top designers.[19][57][174] Studios included United Game Artists (UGA), Hitmaker, Smilebit, Overworks, WOW Entertainment, Amusement Vision, Sega Rosso, Wave Master, and Sonic Team,[175] while Sega AM2 had been taken over earlier in the year by CSK Research Institute[176] and became independent in 2001 as SEGA-AM2 Co., Ltd.[177] Sega's design studios were encouraged to experiment and benefited from a relatively lax approval process.[160] This resulted in games such as UGA's Rez, an attempt to simulate synaesthesia in the form of a rail shooter;[178][179][180] Wow's The Typing of the Dead, a version of The House of the Dead 2 remade into a touch typing trainer;[181][182][183] and Hitmaker's Segagaga, a Japan-exclusive role-playing game in which players are tasked with preventing Sega from going out of business.[184]

Sonic Team's Sonic Adventure, the first fully 3D platform game starring Sega's mascot Sonic the Hedgehog, was considered the "centerpiece" of the Dreamcast launch.[4] At 2.5 million copies, it is the best-selling Dreamcast game.[39][185] Sonic Team also developed the Dreamcast's first online game—ChuChu Rocket!—which was praised for its addictive puzzle gameplay and "frantic" multiplayer matches,[186][187][188] and the critically successful music game Samba de Amigo, which was noted for its expensive maracas peripheral and colorful aesthetic.[189][190][191] Sonic Team's Phantasy Star Online, the first online console RPG, is considered a landmark game for refining and simplifying Diablo's style of gameplay to appeal to console audiences.[105][192][193]

UGA created the music game Space Channel 5 for a female casual audience;[194] players help a female outer-space news reporter, Ulala, fight aliens with "groove energy" by dancing.[54][195] Hitmaker's arcade ports include Crazy Taxi, an open-world arcade racing game known for its addictive gameplay with more than one million copies sold;[4][182] and Virtua Tennis, which revitalized the tennis game genre.[4][196][197] Smilebit's Jet Set Radio, in which players control a Tokyo gang of rebellious inline skaters, is cited as a major example of Sega's commitment to original concepts during the Dreamcast's lifespan.[198][199] Jet Set Radio also popularized cel shaded graphics,[4][200] though it failed to meet Sega's sales expectations.[199][201][202] The role-playing game Skies of Arcadia, developed by Overworks and produced by Rieko Kodama,[203] was acclaimed for its surreal Jules Verne-inspired fantasy world of floating islands and sky pirates, charming protagonists, exciting airship battles and memorable plot.[4][204][205]

AM2 developed what Sega hoped would be the Dreamcast's killer app, Shenmue, a "revenge epic in the tradition of Chinese cinema",[19][206] with a level of detail considered unprecedented for a video game.[207] Incorporating a simulated day-and-night cycle with variable weather, non-player characters with regular schedules, the ability to pick up and examine detailed objects, and introducing the quick-time event in its modern form,[207][208] Shenmue went over budget and was rumored to have cost Sega over $50 million.[209][207][210] According to Moore, Shenmue sold "extremely well", but had no chance of making a profit due to the Dreamcast's limited installed base.[211]

Visual Concepts' NFL 2K football series and its NBA 2K basketball series were critically acclaimed.[212] NFL 2K was considered an outstanding launch game for its high-quality visuals[58][213] and "insightful, context-friendly, and, yes, even funny commentary",[164] while NFL 2K1 featured groundbreaking online multiplayer earlier than its chief competitor, EA's Madden NFL series.[29][214][197] Madden and 2K continued to compete on other platforms through 2004, with the 2K series introducing innovations such as a first-person perspective new to the genre,[215] and eventually launching ESPN NFL 2K5 at the aggressively low price point of $19.95, until EA signed an exclusive agreement with the National Football League, effectively putting every other pro-football game out of business.[216][217] After Sega sold Visual Concepts for $24 million in 2005, the NBA 2K series continued with publisher Take-Two Interactive.[186][218] During the Dreamcast's lifespan, Visual Concepts also collaborated with the Sonic the Hedgehog level designer Hirokazu Yasuhara on the action-adventure game Floigan Bros.[219] and developed the action game Ooga Booga.[220]
Ports and third-party games

Before the launch of the Dreamcast in Japan, Sega announced its NAOMI[221] arcade board, a cheaper alternative to the Sega Model 3.[222] NAOMI shares the same technology as the Dreamcast, with twice as much system, video, and audio memory and a 160 MB flash ROM board in place of a GD-ROM drive, allowing nearly identical home conversions of arcade games.[4][31] Games were ported from NAOMI to the Dreamcast by several leading Japanese arcade companies, including Capcom and Namco.[4] The Dreamcast also used parts similar to those found in personal computers with Pentium II and III processors, allowing a handful of ports of PC games.[223][224]

To appeal to the European market, Sega formed a French affiliate, No Cliché, which developed games such as Toy Commander.[4][225] Sega Europe also approached Bizarre Creations to develop the racing game Metropolis Street Racer.[226] Although Acclaim, SNK, Ubisoft, Midway, Activision, Infogrames, and Capcom supported the Dreamcast during its first year,[29] third-party support proved difficult to obtain due to the failure of the Sega Saturn and the profitability of publishing for the PlayStation.[28] Namco's Soulcalibur, for example, was released for the Dreamcast because of the relative unpopularity of the Soul series at the time; Namco's more successful Tekken franchise was associated with the PlayStation console and PlayStation-based arcade boards.[4] Capcom produced a number of fighting games for the Dreamcast, including the Power Stone series, and a temporarily exclusive[227] entry in the popular Resident Evil series, Resident Evil – Code: Veronica.[186][228][229] The Dreamcast is known for several shoot 'em ups, most notably Treasure's Bangai-O and Ikaruga.[4][227][230] Sega also revived franchises from the Genesis era, such as Appaloosa Interactive's Ecco the Dolphin....
Reception and legacy
A Dreamcast European retail demo kiosk is at the Finnish Museum of Games in Tampere, Finland.

In December 1999, Next Generation rated the Dreamcast four out of five, writing: "If you want the most powerful system available now, showcasing the best graphics at a reasonable price, this system is for you." However, Next Generation gave its future prognosis three out of five, noting that Sony and Nintendo were both due to release more powerful consoles.[255] At the beginning of 2000, five Electronic Gaming Monthly reviewers scored the Dreamcast 8.5, 8.5, 8.5, 8.0, and 9.0 out of 10.[256] In 2001, the Electronic Gaming Monthly reviewers scored it 9.0, 9.0, 9.0, 9.0, and 9.5 out of 10.[257] BusinessWeek named the Dreamcast one of the best products of 1999.[258]

Reasons cited for the failure of the Dreamcast include consumer excitement for the PS2;[57][214][259] a lack of support from EA and Squaresoft, the most popular third parties in the US and Japan respectively;[144] disagreement among executives over Sega's future, and Okawa's lack of commitment to the product;[19] Sega's lack of advertising money, with Bellfield doubting that Sega spent even "half" the $100 million it had pledged to promote the Dreamcast in the US;[29][260] that the market was not ready for online gaming;[124][144] Sega's focus on "hardcore" gamers over mainstream consumers;[57][124] poor timing;[29] and damage to Sega's reputation caused by its several poorly supported previous platforms.[144][261][262] In GamePro, Blake Snow wrote of "the much beloved [Dreamcast] launched years ahead of the competition but ultimately struggled to shed the negative reputation [Sega] had gained during the Saturn, Sega 32X, and Sega CD days. As a result, casual gamers and jaded third-party developers doubted Sega's ability to deliver."[261]

Eurogamer's Dan Whitehead noted that consumers' "wait-and-see" approach, and the lack of support from EA, were symptoms rather than the cause of Sega's decline. He concluded that "Sega's misadventures during the 1990s had left both gamers and publishers wary of any new platform bearing its name".[214] According to 1Up.com's Jeremy Parish, it would be intellectually dishonest to blame Sony for "killing the Dreamcast by overselling the PS2", as Sega's lack of support for previous consoles had made customers hesitant to purchase Dreamcasts.[57]

In 2009, IGN named the Dreamcast the eighth-greatest video game console, praising its software and innovations, including its online play.[39] In 2010, PC Magazine's Jeffrey L. Wilson named the Dreamcast the greatest console and said that it was "gone too soon".[263] In 2013, Edge named the Dreamcast the tenth-best console of the last 20 years, highlighting innovations including in-game voice chat, downloadable content, and second-screen technology through the use of VMUs. Edge wrote that "Sega's console was undoubtedly ahead of its time, and it suffered at retail for that reason... [b]ut its influence can still be felt today."[143] Dan Whitehead of Eurogamer likened the Dreamcast to "a small, square, white plastic JFK. A progressive force in some ways, perhaps misguided in others, but nevertheless a promising life cut tragically short by dark shadowy forces, spawning complex conspiracy theories that endure to this day." He wrote that its short lifespan "may have sealed its reputation as one of the greatest consoles ever", as "nothing builds a cult like a tragic demise".[214] According to IGN's Travis Fahs, "Many hardware manufacturers have come and gone, but it's unlikely any will go out with half as much class as Sega."[4]

    If ever a system deserved to succeed, it was Dreamcast. Dreamcast has a hell of a library. It's dying now, 18 months old, with a larger library than the 5-year-old Nintendo 64. It's a better library than the Nintendo 64. Dreamcast was a wonderful system.

—Journalist Steven L. Kent, March 2001.[264]

The Dreamcast's game library was celebrated.[261] In January 2000, three months after the Dreamcast's North American launch, Electronic Gaming Monthly wrote that "with triple-A stuff like Soul Calibur, NBA 2K, and soon Crazy Taxi to kick around, we figure you're happy you took the 128-bit plunge".[265] In a retrospective, PC Magazine's Jeffrey L. Wilson referred to Dreamcast's "killer library" and said that Sega's creative influence and visual innovation had been at its peak.[263] The staff of Edge agreed with this assessment of Dreamcast games, including Sega's arcade conversions, stating that the system "delivered the first games that could meaningfully be described as arcade perfect".[143] Damien McFerran of Retro Gamer praised Dreamcast's NAOMI arcade ports, and wrote: "The thrill of playing Crazy Taxi in the arcade knowing full well that a pixel-perfect conversion (and not some cut-down port) was set to arrive on the Dreamcast is an experience gamers are unlikely to witness again."[28]

Nick Montfort and Mia Consalvo, writing in Loading... The Journal of the Canadian Game Studies Association, argued that "the Dreamcast hosted a remarkable amount of video game development that went beyond the odd and unusual and is interesting when considered as avant-garde ... It is hard to imagine a commercial console game expressing strong resistance to the commodity perspective and to the view that game production is commerce. But even when it comes to resisting commercialization, it is arguable that Dreamcast games came closer to expressing this attitude than any other console games have."[160] 1Up.com's Jeremy Parish favorably compared Sega's Dreamcast output, which included some of "the most varied, creative, and fun [games] the company had ever produced", with its "enervated" status as a third-party.[57] Fahs noted, "The Dreamcast's life was fleeting, but it was saturated with memorable titles, most of which were completely new properties."[19] According to author Steven L. Kent, "From Sonic Adventure and Shenmue to Space Channel 5 and Seaman, Dreamcast delivered and delivered and delivered."[266]

Some journalists have compared the demise of the Dreamcast with changing trends in the video game industry. In 1001 Video Games You Must Play Before You Die, Duncan Harris wrote: "One of the reasons that older gamers mourned the loss of the Dreamcast was that it signaled the demise of arcade gaming culture ... Sega's console gave hope that things were not about to change for the worse and that the tenets of fast fun and bright, attractive graphics were not about to sink into a brown and green bog of realistic war games."[267] Jeremy Parish, writing for USgamer, contrasted the Dreamcast's diverse library with the "suffocating sense of conservatism" that pervaded the gaming industry in the following decade.[268] According to Sega's head of product implementation, Tadashi Takezaki, the Dreamcast would have been Sega's last video game console no matter how it sold because of the changes in the market and the rise of PCs. He praised the Dreamcast for its features, saying in 2013, "The seeds we sowed with the Dreamcast are finally bearing fruit at this point in time. In some ways, we were going by the seat of our pants, but it was part of the Sega credo at the time — if it's fun, then go for it."" (wikipedia.org)

"Reception and legacy
A Dreamcast European retail demo kiosk is at the Finnish Museum of Games in Tampere, Finland.

In December 1999, Next Generation rated the Dreamcast four out of five, writing: "If you want the most powerful system available now, showcasing the best graphics at a reasonable price, this system is for you." However, Next Generation gave its future prognosis three out of five, noting that Sony and Nintendo were both due to release more powerful consoles.[255] At the beginning of 2000, five Electronic Gaming Monthly reviewers scored the Dreamcast 8.5, 8.5, 8.5, 8.0, and 9.0 out of 10.[256] In 2001, the Electronic Gaming Monthly reviewers scored it 9.0, 9.0, 9.0, 9.0, and 9.5 out of 10.[257] BusinessWeek named the Dreamcast one of the best products of 1999.[258]

Reasons cited for the failure of the Dreamcast include consumer excitement for the PS2;[57][214][259] a lack of support from EA and Squaresoft, the most popular third parties in the US and Japan respectively;[144] disagreement among executives over Sega's future, and Okawa's lack of commitment to the product;[19] Sega's lack of advertising money, with Bellfield doubting that Sega spent even "half" the $100 million it had pledged to promote the Dreamcast in the US;[29][260] that the market was not ready for online gaming;[124][144] Sega's focus on "hardcore" gamers over mainstream consumers;[57][124] poor timing;[29] and damage to Sega's reputation caused by its several poorly supported previous platforms.[144][261][262] In GamePro, Blake Snow wrote of "the much beloved [Dreamcast] launched years ahead of the competition but ultimately struggled to shed the negative reputation [Sega] had gained during the Saturn, Sega 32X, and Sega CD days. As a result, casual gamers and jaded third-party developers doubted Sega's ability to deliver."[261]

Eurogamer's Dan Whitehead noted that consumers' "wait-and-see" approach, and the lack of support from EA, were symptoms rather the cause of Sega's decline. He concluded that "Sega's misadventures during the 1990s had left both gamers and publishers wary of any new platform bearing its name".[214] According to 1Up.com's Jeremy Parish, it would be intellectually dishonest to blame Sony for "killing the Dreamcast by overselling the PS2", as Sega's lack of support for previous consoles had made customers hesitant to purchase Dreamcasts.[57]

In 2009, IGN named the Dreamcast the eighth-greatest video game console, praising its software and innovations, including its online play.[39] In 2010, PC Magazine's Jeffrey L. Wilson named the Dreamcast the greatest console and said that it was "gone too soon".[263] In 2013, Edge named the Dreamcast the tenth-best console of the last 20 years, highlighting innovations including in-game voice chat, downloadable content, and second-screen technology through the use of VMUs. Edge wrote that "Sega's console was undoubtedly ahead of its time, and it suffered at retail for that reason... [b]ut its influence can still be felt today."[143] Dan Whitehead of Eurogamer likened the Dreamcast to "a small, square, white plastic JFK. A progressive force in some ways, perhaps misguided in others, but nevertheless a promising life cut tragically short by dark shadowy forces, spawning complex conspiracy theories that endure to this day." He wrote that its short lifespan "may have sealed its reputation as one of the greatest consoles ever", as "nothing builds a cult like a tragic demise".[214] According to IGN's Travis Fahs, "Many hardware manufacturers have come and gone, but it's unlikely any will go out with half as much class as Sega."[4]

    If ever a system deserved to succeed, it was Dreamcast. Dreamcast has a hell of a library. It's dying now, 18 months old, with a larger library than the 5-year-old Nintendo 64. It's a better library than the Nintendo 64. Dreamcast was a wonderful system.

—Journalist Steven L. Kent, March 2001.[264]

The Dreamcast's game library was celebrated.[261] In January 2000, three months after the Dreamcast's North American launch, Electronic Gaming Monthly wrote that "with triple-A stuff like Soul Calibur, NBA 2K, and soon Crazy Taxi to kick around, we figure you're happy you took the 128-bit plunge".[265] In a retrospective, PC Magazine's Jeffrey L. Wilson referred to Dreamcast's "killer library" and said that Sega's creative influence and visual innovation had been at its peak.[263] The staff of Edge agreed with this assessment of Dreamcast games, including Sega's arcade conversions, stating that the system "delivered the first games that could meaningfully be described as arcade perfect".[143] Damien McFerran of Retro Gamer praised Dreamcast's NAOMI arcade ports, and wrote: "The thrill of playing Crazy Taxi in the arcade knowing full well that a pixel-perfect conversion (and not some cut-down port) was set to arrive on the Dreamcast is an experience gamers are unlikely to witness again."[28]

Nick Montfort and Mia Consalvo, writing in Loading... The Journal of the Canadian Game Studies Association, argued that "the Dreamcast hosted a remarkable amount of video game development that went beyond the odd and unusual and is interesting when considered as avant-garde ... It is hard to imagine a commercial console game expressing strong resistance to the commodity perspective and to the view that game production is commerce. But even when it comes to resisting commercialization, it is arguable that Dreamcast games came closer to expressing this attitude than any other console games have."[160] 1Up.com's Jeremy Parish favorably compared Sega's Dreamcast output, which included some of "the most varied, creative, and fun [games] the company had ever produced", with its "enervated" status as a third-party.[57] Fahs noted, "The Dreamcast's life was fleeting, but it was saturated with memorable titles, most of which were completely new properties."[19] According to author Steven L. Kent, "From Sonic Adventure and Shenmue to Space Channel 5 and Seaman, Dreamcast delivered and delivered and delivered."[266]

Some journalists have compared the demise of the Dreamcast with changing trends in the video game industry. In 1001 Video Games You Must Play Before You Die, Duncan Harris wrote: "One of the reasons that older gamers mourned the loss of the Dreamcast was that it signaled the demise of arcade gaming culture ... Sega's console gave hope that things were not about to change for the worse and that the tenets of fast fun and bright, attractive graphics were not about to sink into a brown and green bog of realistic war games."[267] Jeremy Parish, writing for USgamer, contrasted the Dreamcast's diverse library with the "suffocating sense of conservatism" that pervaded the gaming industry in the following decade.[268] According to Sega's head of product implementation, Tadashi Takezaki, the Dreamcast would have been Sega's last video game console no matter how it sold because of the changes in the market and the rise of PCs. He praised the Dreamcast for its features, saying in 2013, "The seeds we sowed with the Dreamcast are finally bearing fruit at this point in time. In some ways, we were going by the seat of our pants, but it was part of the Sega credo at the time — if it's fun, then go for it." (wikipedia.org)

"The 3DO Interactive Multiplayer, also referred to as simply 3DO, is a home video game console developed by The 3DO Company. Conceived by entrepreneur and Electronic Arts founder Trip Hawkins, the 3DO was not a console manufactured by the company itself, but a set of specifications, originally designed by Dave Needle and Robert J. Mical of New Technologies Group, that could be licensed by third parties. Panasonic produced the first models in 1993, and further renditions of the hardware were released afterwards by GoldStar, Sanyo, Creative Labs, and Samsung Electronics.

Despite having a highly promoted launch (including being named Time magazine's "1993 Product of the Year"), the console received mixed to negative reviews, and an oversaturated console market prevented the system from achieving success comparable to that of competing consoles from Sega and Sony, leading to its discontinuation in 1996. In 1997, The 3DO Company sold its hardware business to Samsung.[8]
History

The 3DO Interactive Multiplayer was originally conceived by The 3DO Company, founded on September 12, 1991, by Electronic Arts founder Trip Hawkins. The company's objective was to create a next-generation, CD-based video game/entertainment standard which would be manufactured by various partners and licensees; 3DO would collect a royalty on each console sold and on each game manufactured. To game publishers, the low US$3 royalty rate per game was a better deal than the higher royalties paid to Nintendo and Sega when making games for their consoles. The 3DO hardware itself was designed by Dave Needle and R. J. Mical (designers of the Amiga and Atari Lynx), starting from an outline on a restaurant napkin in 1989.[9] Trip Hawkins was a long-time acquaintance of Needle and Mical and found that their design very closely fit his philosophy for architecture and approach, so he decided: "Rather than me start a brand new team and starting from scratch it just made a lot of sense to ... join forces with them and shape what they were doing into what I wanted it to be."[9]

The 3DO Company lacked the resources to manufacture consoles, and instead licensed the hardware to other companies for manufacturing. Trip Hawkins recounted that they approached every electronics manufacturer, but that their chief targets were Sony and Panasonic, the two largest consumer electronics companies in the world.[9] However, Sony had already begun development of its own console, the PlayStation, and ultimately decided to continue work on it rather than sign with 3DO.[9] According to the former Sega of America CEO Tom Kalinske, The 3DO Company held very serious talks with Sega about becoming involved with the 3DO, but Sega passed due to concerns over cost.[10] Panasonic launched the 3DO with its FZ-1 model in 1993, though Goldstar and Sanyo would later manufacture the 3DO as well. Companies that obtained the hardware license but never actually sold 3DO units include Samsung,[11] Toshiba,[12] and AT&T, which went so far as to build prototype AT&T 3DO units and display them at the January 1994 Consumer Electronics Show.[13]

Licensing to independent manufacturers made the system extremely expensive. The manufacturers had to make a profit on the hardware itself, whereas most major game console manufacturers, such as Sega and Sony, sold their systems at a loss, with expectations of making up for the loss with software sales. The 3DO was priced at US$699,[14][15] far above competing game systems and aimed at high-end users and early adopters. Hawkins has argued that 3DO was launched at $599, and not "higher myths that are often reported".[16] In a later interview, Hawkins clarified that while the suggested retail price was $699, not all retailers sold the system at that price.[9] Goldstar, Sanyo, and Panasonic's later models were less expensive to manufacture than the FZ-1 and were sold for considerably lower prices. For example, the Goldstar model launched at $399.[4] In addition, after six months on the market, the price of the FZ-1 had dropped to $499,[17][18] leading some to contend that the 3DO's cost was not as big a factor in its market failure as is usually claimed.[9]

Hawkins believed the 3DO system could become a dominant standard in a similar way to the VHS video cassette format, with several companies collectively promoting the standard against individual competitors backing their own proprietary technologies, as Sony had done with Betamax against VHS. It was also believed that companies could compete more effectively by leveraging a common standard, as opposed to having to attract developers to individual formats, with Hawkins noting that this would be "tough for Atari and Sony". Indeed, Hawkins believed that NEC's failure to establish its TurboGrafx system, despite being "much bigger than Sony", illustrated the difficulties faced by new entrants to the console market, and thought that Sony, in following the business model of Sega and Nintendo, "would have had a better chance if it had partnered with some of the others". Meanwhile, other products were not regarded as competitive threats: the Atari Jaguar was perceived as "primitive" and "slightly better than a 16-bit system", and the Philips CD-i was regarded as "really obsolete by today's standards". Both 3DO and Philips, seeking to pioneer the broader concept of interactive entertainment, aimed to sell on the order of one million units during 1994 and into 1995.[19]

Hawkins claimed that the console was HDTV-capable, and that the company could use its technology for a set-top box.[20] It was believed the platform would appeal to cable companies seeking to provide digital interactive services, with broadcasts being accompanied by digital information, eventually leading to the development of video-on-demand services on what was described as a "client-server interactive network", with an interactive networking trial having been announced in collaboration with US West in Omaha, Nebraska for the autumn of 1994.[19] Computer Gaming World reported in January 1994 that 3DO "is poised for an avalanche of software support to appear in the next 12 months", unlike the Atari Jaguar and Pioneer LaserActive. The magazine predicted that "If 3DO's licensees can get enough machines and software out in the market, this could very well become the interactive gamer's entry level machine" and possibly "the ideal plug and play solution for those of us who are tired of playing circuit board roulette with our personal computers".[21] Electronic Arts promoted the console in two-page advertisements, describing it as a "technological leap" and promising "twenty new titles ... over the next twelve months".[22]

The launch of the platform in October 1993 received a great deal of attention in the press as part of the "multimedia wave" in the computer world at the time. Return Fire, Road Rash, FIFA International Soccer, and Jurassic Park Interactive had been slated for launch releases but were pushed to mid-1994 due to the developers' struggles with the then-cutting-edge hardware.[9] Moreover, The 3DO Company made continued updates to the console hardware almost up to the system's release, which resulted in a number of third-party titles missing the launch date, in some cases by less than a month, because developers were not left enough time to fully test them on the finalized hardware.[23] The only 3DO software available at launch was the third-party game Crash 'n Burn.[9][24] Panasonic also failed to manufacture an ample supply of the console in time for launch day, and as a result most retail stores only received one or two units.[1] By mid-November, the 3DO had sold 30,000 units.[25]

The system was released in Japan in March 1994 with an initial lineup of six games. The Japanese launch was moderately successful, with 70,000 units shipping to 10,000 stores.[12] However, sales soon dropped and by 1995 the system was known in Japan as a host for pornographic releases.[26]

The 3DO's claim to the title of most advanced console on the market was lost with the 1994 Japanese launches of the Sony PlayStation and Sega Saturn. The 3DO Company responded by emphasizing their console's large existing software library, lower price (both the Panasonic and Goldstar models were $299 by this time), and promised successor: the M2.[27] To assure consumers that the 3DO would still be supported, the M2 was initially announced as an add-on for the 3DO.[28] It was later revealed that the M2 would be an entirely separate console, albeit one with 3DO backward compatibility. Eventually the M2 project was cancelled.

Unlike Panasonic, Goldstar initially produced only 3DO hardware, not software. This made it difficult to manage competitive price drops, and when the price of the Goldstar 3DO dropped to $199 in December 1995, the company took a loss of more than $100 on each sale.[29] Goldstar tried switching to the usual industry model of selling hardware at a loss and profiting on software, but though a handful of Goldstar games were published for the 3DO, Goldstar's software development operation arrived too late to allow it to turn a profit on the 3DO. This lack of a profitable business model and Panasonic's acquisition of exclusive rights to the M2 technology were cited as the two chief reasons for Goldstar dropping support for the 3DO in early 1996.[29] During the second quarter of 1996, several of the 3DO's most loyal software supporters, including the software division of The 3DO Company itself, announced they were no longer making games for the system, leaving Panasonic as the only company supporting active software development for the 3DO.[30]

The 3DO system was eventually discontinued towards the end of 1996, with a complete shutdown of all internal hardware development and divestment of the M2 technology. The 3DO Company restructured itself around the same time, selling off its hardware division to become a multi-platform company focused on software development and online gaming.[31] After The 3DO Company sold the Opera hardware to Samsung in 1997,[8] the hardware was revived in South Korea for another two years before being discontinued again in late 1998 or early 1999. The number of systems produced after the Samsung buyout is unknown.

The initial high price is considered to be one of the many issues that led to the 3DO's failure, along with a lack of the significant funding that larger companies such as Sony could draw on.[4] In an interview shortly after The 3DO Company dropped support for the system, Trip Hawkins attributed its failure to the model of licensing all hardware manufacturing and software to third parties. He reasoned that for a console to be a success, it needed a single strong company to take the lead in marketing, hardware, and software, and pointed out that it was essentially a lack of coordination between The 3DO Company, Panasonic, and the 3DO's software developers which had led to the console launching with only one game ready.

"PlayStation (Japanese: プレイステーション, Hepburn: Pureisutēshon, officially abbreviated as PS) is a video gaming brand that consists of five home video game consoles, two handhelds, a media center, and a smartphone, as well as an online service and multiple magazines. The brand is produced by Sony Interactive Entertainment, a division of Sony; the first PlayStation console was released in Japan in December 1994, and worldwide the following year.[1]

The original console in the series was the first console of any type to ship over 100 million units, doing so in under a decade.[2] Its successor, the PlayStation 2, was released in 2000; it is the best-selling home console to date, having reached over 155 million units sold by the end of 2012.[3] Sony's next console, the PlayStation 3, was released in 2006, selling over 87.4 million units by March 2017.[4] Its successor, the PlayStation 4, was released in 2013, selling a million units within a day and becoming the fastest-selling console in history.[5] The latest console in the series, the PlayStation 5, was released in 2020[6] and sold 10 million units in its first 249 days, unseating its predecessor as the fastest-selling PlayStation console to date.[7]

The first handheld console in the series, the PlayStation Portable (PSP), sold a total of 80 million units worldwide by November 2013.[8] Its successor, the PlayStation Vita (PSVita), which launched in Japan in December 2011 and in most other major territories in February 2012, sold over four million units by January 2013.[9] PlayStation TV is a microconsole and a non-portable variant of the PlayStation Vita handheld game console.[10] Other hardware released as part of the PlayStation series includes the PSX, a digital video recorder which was integrated with the PlayStation and PlayStation 2, though it was short-lived due to its high price and was never released outside Japan, as well as a Bravia television set which has an integrated PlayStation 2. The main line of controllers used by the PlayStation series is the DualShock, a family of vibration-feedback gamepads that had sold 28 million units by June 2008.[11]

The PlayStation Network is an online service with about 110 million registered users as of June 2013[12] and over 103 million monthly active users as of December 2019.[13] It comprises an online virtual market, the PlayStation Store, which allows the purchase and download of games and various forms of multimedia; a subscription-based online service known as PlayStation Plus; and a social gaming networking service called PlayStation Home, which had over 41 million users worldwide at the time of its closure in March 2015.[14] PlayStation Mobile (formerly PlayStation Suite) is a software framework that provides PlayStation content on mobile devices. Version 1.xx supports the PlayStation Vita, PlayStation TV, and certain devices that run the Android operating system, whereas version 2.00, released in 2014, targeted only the PlayStation Vita and PlayStation TV.[15] Currently, content set to be released under the framework consists only of original PlayStation games.[16]

Seventh-generation PlayStation products also use the XrossMediaBar, a Technology & Engineering Emmy Award–winning graphical user interface.[17] A touchscreen-based user interface called LiveArea, which integrates social networking elements, was launched for the PlayStation Vita. Additionally, the PlayStation 2 and PlayStation 3 consoles featured support for Linux-based operating systems (Linux for PlayStation 2 and OtherOS, respectively), though this support has since been discontinued. The series has also been known for its numerous marketing campaigns, the most recent being the "Greatness Awaits" and, later, "Play Has No Limits" commercials in the United States.

The series also has a strong line-up of first-party games thanks to PlayStation Studios, a group of studios owned by Sony Interactive Entertainment that develop games exclusively for PlayStation consoles. In addition, the series features various budget re-releases of games by Sony with different names for each region; these include the Greatest Hits, Platinum, Essentials, and The Best selections of games.
History
Origins
Original PlayStation logo (1994)

PlayStation was the brainchild of Ken Kutaragi, a Sony executive who managed one of the company's hardware engineering divisions and was later dubbed "The Father of the PlayStation".[18][19]

Until 1991, Sony had little direct involvement with the video game industry. The company supplied components for other consoles, such as the sound chip for the Super Famicom from Nintendo, and operated a video game studio, Sony Imagesoft.[20] As part of a joint project between Nintendo and Sony that began as early as 1988, the two companies worked to create a CD-ROM version of the Super Famicom,[21] though Nintendo denied the existence of the Sony deal as late as March 1991.[22] At the Consumer Electronics Show in June 1991, Sony revealed a Super Famicom with a built-in CD-ROM drive that incorporated Green Book technology or CD-i, called "Play Station" (also known as SNES-CD). However, a day after the announcement at CES, Nintendo announced that it would be breaking its partnership with Sony, opting to go with Philips instead but using the same technology.[23] The deal was broken by Nintendo after they were unable to come to an agreement on how revenue would be split between the two companies.[23] The breaking of the partnership infuriated Sony President Norio Ohga, who responded by appointing Kutaragi with the responsibility of developing the PlayStation project to rival Nintendo.[23]
The sole remaining prototype of Sony's original "PlayStation", a Super NES with a built-in CD-ROM drive

At that time, negotiations were still ongoing between Nintendo and Sony, with Nintendo offering Sony a "non-gaming role" in its new partnership with Philips. The proposal was swiftly rejected by Kutaragi, who was facing increasing criticism from within Sony over his efforts to enter the video game industry. Negotiations officially ended in May 1992, and to decide the fate of the PlayStation project, a meeting was held in June 1992 consisting of Sony President Ohga, PlayStation head Kutaragi, and several senior members of Sony's board. At the meeting, Kutaragi unveiled to the board a proprietary CD-ROM-based system he had been working on, which involved playing video games with 3D graphics. Eventually, Ohga decided to retain the project after being reminded by Kutaragi of the humiliation he had suffered from Nintendo. Nevertheless, due to strong opposition from a majority present at the meeting as well as widespread internal opposition to the project among the older generation of Sony executives, Kutaragi and his team were shifted from Sony's headquarters to Sony Music, a completely separate financial entity owned by Sony, so as to retain the project and maintain relationships with Philips for the MMCD development project (which helped lead to the creation of the DVD).[23]

According to SCE's producer Ryoji Akagawa and chairman Shigeo Maruyama, there was uncertainty over whether the console should primarily focus on 2D sprite graphics or 3D polygon graphics. Eventually, after Sony witnessed the success of Sega's Virtua Fighter in Japanese arcades, "the direction of the PlayStation became instantly clear" and 3D polygon graphics became the console's primary focus.[24]

The PlayStation logo was designed by Manabu Sakamoto. He wanted the logo to capture the 3D support of the console, but instead of just adding apparent depth to the letters "P" and "S", he created an optical illusion that suggested the letters in depth of space. Sakamoto also stuck with four bright principal colors, red, yellow, green, and blue, only having to tune the green color for better harmony across the logo. Sakamoto also designed the black and white logo based on the same design, reserved for times where colors could not be used.[25]
Formation of Sony Computer Entertainment

At Sony Music Entertainment, Kutaragi worked closely with Shigeo Maruyama, the CEO of Sony Music, and with Akira Sato to form Sony Computer Entertainment Inc. (SCEI) on November 16, 1993.[26] A building block of SCEI was its initial partnership with Sony Music, which helped SCEI attract creative talent and assisted it in manufacturing, marketing, and producing discs, something Sony Music had been doing with music discs. The final two key members of SCEI were Terry Tokunaka, the president of SCEI from Sony's headquarters, and Olaf Olafsson. Olafsson was CEO and president of the New York-based Sony Interactive Entertainment,[27] which was the parent company of the 1994-founded Sony Computer Entertainment of America (SCEA).

The PlayStation project, SCEI's first official project, was finally given the green light by Sony executives in 1993 after a few years of development. Also in 1993, Phil Harrison, who later became President of SCE Worldwide Studios, was recruited into SCEI to attract developers and publishers to produce games for their new PlayStation platform.[23]

Computer Gaming World in March 1994 reported a rumor that the "Sony PS-X" would be released in Japan "before the end of this year and will retail for less than $400".[28] After a 1994 demonstration of Sony's distribution plan and tech demos of the new console to game publishers and developers at a Tokyo hotel, numerous developers began to approach Sony; two who later became major partners were Electronic Arts in the West and Namco in Japan. One of the factors that attracted developers to the platform was a 3D-capable, CD-ROM-based console that was much cheaper and easier to manufacture games for than Nintendo's cartridge-based rival console. The console hit Japanese stores in December 1994 and sold heavily thanks to a lower price point than its competitor, the Sega Saturn. Its popularity spread after its worldwide release in North America and Europe." (wikipedia.org)

"The PlayStation[a] (abbreviated as PS, commonly known as the PS1/PS one or its codename PSX) is a home video game console developed and marketed by Sony Interactive Entertainment. It was released in Japan on 3 December 1994, in North America on 9 September 1995, in Europe on 29 September 1995, and in Australia on 15 November 1995. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn.

Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers.

The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Premier PlayStation franchises included Gran Turismo, Crash Bandicoot, Spyro, Tomb Raider, Metal Gear, Tekken, and Final Fantasy, all of which spawned numerous sequels. PlayStation games continued to sell until Sony ceased production of the PlayStation and its games on 23 March 2006—over eleven years after it had been released, and less than a year before the debut of the PlayStation 3.[10] A total of 3,061 PlayStation games were released, with cumulative sales of 967 million units.

The PlayStation signalled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units.[13] Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS One.
History
Background
A photo of the only known SNES-based PlayStation prototype, with a controller and disc drive in the foreground[14]

The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation".[15][16] Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom.[17] Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities.[18] His willingness to work with Nintendo derived from both his admiration of the Famicom and conviction in video game consoles becoming the main home-use entertainment systems.[19] Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge,[20] president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé.[17]

The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony.[8] Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES.[17][21] Since Sony was already contracted to produce the SPC-700 sound processor for the SNES,[8] Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD".[22][23] The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative.[24]

Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the ill-fated MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware.[25][26] Although the initial agreement between Nintendo and Sony was about producing a CD-ROM add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design.[8][27] Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market.[8][28][26] Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software that it had been aggressively pursuing as a secondary application.[29]

The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas.[30] However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department.[31] Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony.[32][33][30] He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines.[34][26]

Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail.[35] On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry,[34] Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony.[17][36][37]
Inception
Ken Kutaragi, the "Father of the PlayStation", pictured at the Game Developers Choice Awards in 2014

Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop its own console.[38] Nintendo's contract-breaking was met with consternation in the Japanese business community,[17] as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones.[26] Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but Sega's board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them with the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?"[39] Sony halted its research, but decided to turn what it had developed with Nintendo and Sega into a console of its own based on the SNES.[39]

Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development.[26][40] Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company.[41] Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work.[17][34] Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992.[42]

To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time.[43] Despite gaining Ohga's enthusiasm, there remained opposition from a majority of those present at the meeting. Older Sony executives, who saw Nintendo and Sega as "toy" manufacturers, also opposed the project.[44] The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations.[45] After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters.[23][46]

Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ),[47] a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project.[44] The involvement of SMEJ proved crucial to the PlayStation's early development as the process of manufacturing games on CD-ROM format was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice presidents of the division that ran the PlayStation business.[28] Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry.[48][49] On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.[34][50] According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus.[51] SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter as it proved "just at the right time" that making games with 3D imagery was possible.[52] Maruyama claimed that Sony further wanted to emphasize the new console's ability to utilize redbook audio from the CD-ROM format in its games alongside high quality visuals and gameplay.[53]

Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX).[34] Sony formed their European division and North American division, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995.[54][55] The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch.[56] The console was not marketed with Sony's name in contrast to Nintendo's consoles. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy".[28][29]
Development

Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.[57] Recent consoles like the Atari Jaguar and 3DO suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts in gaining the endorsement of arcade-savvy developers.[26] A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal.[58] Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was keen to participate in the PlayStation project as a third-party developer since Namco rivalled Sega in the arcade market.[59] Attaining these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995);[26][7] Ridge Racer was one of the most popular arcade games at the time,[60] and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game.[61] Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter.[62] The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994.[63]
Ian Hetherington pictured in 1990. Hetherington and Psygnosis played important roles in the PlayStation project.

Despite securing the support of various Japanese studios, Sony had no developers of their own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.[26][7] Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it.[64] Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced.[65] In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development.[64][66]

The purchase of Psygnosis marked another turning point for the PlayStation as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system.[28] Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES.[67] When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon their plans for a workstation-based development system in favour of SN Systems', thus securing a cheaper and more efficient method for designing software.[26] An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and a debugger.[68] SN Systems produced development kits for future PlayStation systems, including the PlayStation 2 and was bought out by Sony in 2005.[69]

Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, the then-representative director of SCEE, believed that Sony's emphasis on developer assistance reduced most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California and Tokyo housed technical support teams that could work closely with third-party developers if needed.[49][70] Peter Molyneux, who owned Bullfrog Productions at the time, admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have".[71] Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. In contrast to other disc-reading consoles such as the 3DO, the PlayStation could quickly generate and synthesise data from the CD since it was an image-generation system, rather than a data-replay system.[72]

The PlayStation's architecture and interconnectability with PCs were beneficial to many software developers. The use of the programming language C proved useful during the early stages of development as it safeguarded future compatibility of the machine should developers decide to make further hardware revisions. Sony used the free software GNU C compiler, also known as GCC, to guarantee short debugging times as it was already familiar to many programmers.[66] Despite the inherent flexibility, some developers found themselves restricted due to the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase.[73] An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.[74] Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down.[75] Kutaragi saw the biggest challenge in developing the system to be balancing the conflicting goals of high performance, low cost, and being easy to program for, and felt he and his team were successful in this regard.[75]

The PlayStation's technical specifications were finalised in 1993 and its design was completed during 1994.[76] The PlayStation name and final design were confirmed at a press conference on 10 May 1994, although the price and release dates had not yet been disclosed.[77]
Launch

Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800.[7][78] Sales in Japan began as a "stunning"[17] success, with long queues in shops.[26] The console sold 100,000 units on the first day[79] and two million units within six months,[80] although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter.[7][81] By the end of 1994, 300,000 PlayStation units had been sold in Japan compared to 500,000 Saturn units.[82] Before long, a grey market emerged for PlayStations shipped from Japan to North America and Europe, with some buyers paying as much as £700 for a console.[79]

    "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock.

—Lee Hutchinson of Ars Technica, a Babbage's employee in 1995, recalling how PlayStation preorders greatly outnumbered Saturn sales at his shop.[83]

Before the North American release, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At its keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, called Steve Race, the head of development, to the conference stage; Race simply said "$299" and left the stage to a round of applause.[84][85][86][87] Attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994).[88][89][90] In addition, Sony announced that no games would be bundled with the console.[26][91]

Although the Saturn had released early in the United States to gain an advantage over the PlayStation,[92] the surprise launch upset many retailers who were not informed in time, harming sales.[93] Some retailers such as KB Toys responded by dropping the Saturn entirely.[94] The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories.[26] The well-received Ridge Racer contributed to the PlayStation's early success,[81][95][96] with some critics considering it superior to Sega's arcade counterpart Daytona USA (1994).[97][98] There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch,[26] in comparison to the Saturn's six launch games.[99]

The PlayStation released in Europe on 29 September 1995[3] and in Australia on 15 November 1995.[4] By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million.[100][101] Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos.[64] Within its first year, the PlayStation secured over 20% of the entire American video game market.[102] From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b][104] though the SNES and Mega Drive from the fourth generation still outsold it.[105] Sony reported that the attach rate of sold games and consoles was four to one.[106] To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America.[107] By early 1996, the PlayStation had grossed $2 billion (equivalent to $3.732 billion 2022) from worldwide hardware and software sales.[108] By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK.[109] Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64.[110]
Marketing success and later years

The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America.[111] Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop.[112] While the Saturn was positioned towards 18- to 34-year-olds,[113] the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games.[114] The console was marketed with advertising slogans stylised as "LIVE IN YOUR WORLD. PLAY IN OURS" and "U R NOT E" (red E).[115][26] Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'"[116] As the console's appeal enlarged, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well.[117]

Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical over Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers.[118] Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate PlayStation's emerging identity. Sony partnered with prominent nightclub owners such as Ministry of Sound and festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be tested.[119] Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience.[120] Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues.[119][121] By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing.[118]

In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs.[122] This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year.[123] The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996,[124] with 2.2 million consoles sold in the region by the end of the year.[125] Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64.[126][127] Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts.[128] Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time.[112]

By 1998, Sega, spurred by its declining market share and significant financial losses,[129] launched the Dreamcast as a last-ditch attempt to stay in the industry.[130] Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry.[131][132] Sony still held 60% of the overall video game market share in North America at the end of 1999.[133] Sega's initial confidence in its new console was undermined when Japanese sales were lower than expected,[134] with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software.[135] On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers.[136][137] The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PSOne, a smaller, redesigned variant which went on to outsell all other consoles in that year, including the PlayStation 2.[138] The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001, and abandoning the console business entirely.[132] The PlayStation was eventually discontinued on 23 March 2006—over eleven years after its release, and less than a year before the debut of the PlayStation 3....
Game library
See also: List of PlayStation games (A–L), List of PlayStation games (M–Z), List of best-selling PlayStation video games, and List of cancelled PlayStation video games

A total of 7,918 PlayStation games have been released worldwide.[193] The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units.[11] After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units.[194]

The PlayStation featured a diverse game library which grew to appeal to all types of players. The first two games available at launch were Jumping Flash! (1995) and Ridge Racer,[195][196] with Jumping Flash! heralded as an ancestor of 3D graphics in console gaming.[197] Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998) and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan,[198] and is considered one of the most influential and greatest video games ever made.[199]

At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue;[65] its breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community.[200] Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996,[201] with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion.[202][203] Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan.[204] Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2.[79][142]

Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format.[205]
Reception

The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU was "fairly average", the supplementary custom hardware, such as the GPU and sound processor, was stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war".[206] Albert Kim of Entertainment Weekly praised the PlayStation as a technological marvel rivalling the hardware of Sega and Nintendo.[207] Famicom Tsūshin scored the console 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40.[208]

In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising its stance on 2D and role playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third party developers almost unanimously favouring it over its competitors.[209]
Legacy

SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985 and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo.[210] By the late 1990s, Sony became a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third.[211]

The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide,[8][212] with many critics attributing the console's success to third-party developers.[83] It remains the fifth best-selling console of all time as of 2023, with a total of 102.49 million units sold.[212] Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console.[8] Its success resulted in a significant financial boon for Sony, with profits from its video game division coming to contribute roughly 23% of the company's overall profits.[213]

Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor,[211] which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube.[214][215][216] The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors that led to Sega abandoning the console market.[217][218] To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. Hundreds of PlayStation games were re-released as PS One Classics for purchase and download on the PlayStation Portable, PlayStation 3, and PlayStation Vita.[219] The PlayStation 2 and PlayStation 3 also maintained backward compatibility with original PlayStation discs.[220]

The PlayStation has often been ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as a key factor in its mass success and lauding it as a "game-changer in every sense possible".[221] In 2009, IGN ranked the PlayStation the seventh best console on its list, noting that its appeal to older audiences was a crucial factor in propelling the video game industry, along with its role in transitioning the game industry to the CD-ROM format.[222] Keith Stuart of The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s".[223]
CD format

The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64.[c][215] After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64, likely because the proprietary cartridge format helped enforce copy protection, given the company's substantial reliance on licensing and exclusive games for its revenue.[225]

Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week compared to two to three months.[226][227] Further, the cost of production per unit was far lower, allowing Sony to offer games at roughly 40% lower cost to the user than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model Sony Music had used for audio CDs. The production flexibility of CD-ROMs meant that Sony could quickly produce larger volumes of popular games to get them onto the market, something that could not be done with cartridges due to their manufacturing lead time.[228][229] The lower production costs of CD-ROMs also gave publishers an additional source of profit: budget-priced reissues of games that had already recouped their development costs.[112]
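
As a rough, purely illustrative calculation (the dollar figures below are hypothetical, chosen only to make the arithmetic concrete, and do not come from the source), a roughly 40% lower retail price can indeed leave per-unit net revenue unchanged once the cheaper disc manufacturing is taken into account:

    \text{cartridge: } \$65_{\text{retail}} - \$27_{\text{manufacturing}} = \$38_{\text{net}}
    \text{CD-ROM: } \$39_{\text{retail}} - \$1_{\text{manufacturing}} = \$38_{\text{net}}, \qquad \$39 = 0.6 \times \$65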

Tokunaka remarked in 1996:

    Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation.[112]

The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.[215] As a result, some third-party developers switched to the PlayStation, including Square, whose Final Fantasy VII, and Enix (later merged with Square to form Square Enix), whose Dragon Quest VII (2000), had originally been planned for the Nintendo 64.[230][231] Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo itself or second parties such as Rare." (wikipedia.org)

"Optical disc packaging is the packaging that accompanies CDs, DVDs, and other formats of optical discs. Most packaging is rigid or semi-rigid and designed to protect the media from scratches and other types of exposure damage....

A jewel CD case is a compact disc case that has been used since the compact disc was first released in 1982. It is a three-piece plastic case, measuring 142 by 125 by 10 millimetres (5.59 in × 4.92 in × 0.39 in), a volume of 177.5 cubic centimetres (10.83 cu in), which usually contains a compact disc along with the liner notes and a back card. Two opposing transparent halves are hinged together to form the casing, the back half holding a media tray that grips the disc by its hole. All three parts are made of injection-moulded polystyrene.[1]
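
As a quick arithmetic check (not additional information from the source), the quoted volume follows directly from the stated outer dimensions once they are expressed in centimetres:

    V = 14.2\,\text{cm} \times 12.5\,\text{cm} \times 1.0\,\text{cm} = 177.5\,\text{cm}^3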

The front lid contains two, four, or six tabs to keep any liner notes in place. The liner notes typically will be a 120 by 120 millimetres (4.7 in × 4.7 in) booklet, or a single 242 by 120 millimetres (9.5 in × 4.7 in) leaf folded in half. In addition, there is usually a back card, 150 by 118 millimetres (5.9 in × 4.6 in), underneath the media tray and visible through the clear back, often listing the track names, studio, copyright data and other information. The back card is folded into a flattened "U" shape, with the sides being visible along the ends (often referred to as the spine) of the case. The ends usually have the name of the release and the artist, and often label or catalogue information printed on them, and are designed to be visible when the case is stored vertically, 'book-style', on shelves.[citation needed]

The back media tray snaps into the back cover, and is responsible for securing the disk. The center is a circular hub of teeth which grip the disc by its hole. This effectively suspends the disk in the middle of the container, preventing the recording surface from being scratched.[1] The media tray was originally constructed of a flexible black polystyrene, but many newer trays use a more fragile transparent polystyrene. This allows the reverse of the back card, which is usually used for additional artwork, to be visible. This format did not become common until the mid-1990s.[citation needed]

The jewel case is the standard case used by the majority of manufacturers and it is the most common type of case found in record and movie stores. Jewel cases are occasionally used for DVDs, but generally not for those that contain major film releases. Blank Blu-ray Disc media is also most commonly sold in standard-width jewel cases....

According to Philips, the name "jewel case" reflects either the generally high quality of the case design compared to initial attempts, or its appearance. According to one publication,[1] initial attempts at packaging CDs were unsatisfactory. When the new design, by Peter Doodson, was found to be "virtually perfect" it was dubbed the "jewel case".[1] Another publication[2] quotes Doodson describing that he "specified polished ribs as they pick up the light and shine" and states that the resulting appearance led to the name....

Endurance: The CD jewel case has a tight and firm grip of the CD because of the tray's "teeth" or "lock". Because of this, even if the CD jewel case is turned upside-down, left, or right, the CD is held in place. Flimsier cases may cause the CD to become loose, or even fall out. Also, since the jewel case is made of plastic, it is sturdier compared to cardboard, paper, or foams. When pressure is applied to the CD jewel case, the case will break first before the CD. If the case is made of thin cardboard, there is a greater chance that the CD would break or get damaged because the weight is directed onto it.[citation needed]

Storage: The jewel case's material allows CDs to be stored for decades without damage. The same is not as true of other cases, since paper can stick to discs due to air, humidity, and other factors. The CD jewel case may also be preferred because it keeps a shelf orderly. Since the jewel case has existed for decades, there are many CD shelves, racks, and other products on the market made specifically for jewel cases...

Double disc albums can either be packaged in standard-thickness jewel cases with hinged media trays which can be lifted to reveal the second disc (trays hinged on the left are known as "Smart Tray" format; those hinged on the right are known as "Brilliant Box" format) or in a "double jewel case", sometimes called a multi-CD jewel case, "fatbox", or "Bookbox", which is slightly larger than two normal jewel cases stacked on top of each other, and can hold 2 to 6 CDs. Double jewel cases do not fit in some CD racks; however, some racks have a few extra wide slots specifically to accommodate them.[citation needed]

Jewel cases for CDs released early in the format's life featured frosted top and bottom spines as opposed to the ridged ones more commonly used. As a result of their rarity, these types of jewel cases are fairly coveted among collectors.[3]

"A personal computer game, also known as computer game or abbreviated PC game, is a electronic game played on a personal computer (PC) and form of video game. They are defined by the open platform nature of PC systems.

Mainframe and minicomputer games are a precursor to personal computer games. Home computer games became popular following the video game crash of 1983, leading to the era of the "bedroom coder". In the 1990s, PC games lost mass-market traction to fifth-generation console games on systems such as the Sega Saturn, Nintendo 64 and PlayStation.[citation needed] They have enjoyed a resurgence in popularity since the mid-2000s through digital distribution on online service providers.[1][2] In this context, personal computers and general computer software are usually taken to mean IBM PC compatible systems, although mobile devices – smartphones and tablets running platforms such as Android or iOS – are also personal computers in the general sense, as opposed to consoles or arcade machines. Microsoft Windows, using Direct3D, has become the most popular operating system for PC games. Games using 3D graphics generally require a graphics processing unit, and PC games have been a major influence on the development and marketing of graphics cards. Emulators make it possible to play games developed for other platforms. The demoscene originated from computer game cracking.

The uncoordinated nature of the PC game market makes precisely assessing its size difficult.[1] The PC remains the most important gaming platform, with 60% of developers most interested in developing a game for it and 66% of developers currently developing a game for PC.[3][better source needed] In 2018, the global PC games market was valued at about $27.7 billion.[4][better source needed] According to research data provided by Statista in 2020, there were an estimated 1.75 billion PC gamers worldwide, up from 1.5 billion PC gaming users in the previous year.[5][better source needed] Newzoo reports that the PC gaming sector was the third-largest category across all platforms as of 2016, with the console sector second-largest and the mobile gaming sector biggest. 2.2 billion video gamers generate US$101.1 billion in revenue, excluding hardware costs, and "digital game revenues will account for $94.4 billion or 87% of the global gaming market".[6][7][better source needed] The APAC region was estimated to generate $46.6 billion in 2016, or 47% of total global video game revenues (note: not only PC games). China alone accounts for half of APAC's revenues (at $24.4 billion), cementing its place as the largest video game market in the world, ahead of the US's anticipated market size of $23.5 billion....
Growth of IBM PC compatible games

Among launch titles for the IBM Personal Computer (PC) in 1981 was Microsoft Adventure, which IBM described as bringing "players into a fantasy world of caves and treasures".[21] BYTE that year stated that the computer's speed and sophistication made it "an excellent gaming device", and IBM and others sold games like Microsoft Flight Simulator. The PC's CGA graphics and speaker sound were poor, however, and most customers bought the powerful but expensive computer for business.[22][23] One ComputerLand owner estimated in 1983 that a quarter of corporate executives with computers "have a game hidden somewhere in their drawers",[24] and InfoWorld in 1984 reported that "in offices all over America (more than anyone realizes) executives and managers are playing games on their computers",[25] but software companies found selling games for the PC difficult; an observer said that year that Flight Simulator had sold hundreds of thousands of copies because customers with corporate PCs could claim that it was a "simulation".[26]

From mid-1985, however, what Compute! described as a "wave" of inexpensive IBM PC clones from American and Asian companies, such as the Tandy 1000, caused prices to decline; by the end of 1986, the equivalent to a $1600 real IBM PC with 256K RAM and two disk drives cost as little as $600, lower than the price of the Apple IIc. Consumers began purchasing DOS computers for the home in large numbers. While often purchased to do work on evenings and weekends, clones' popularity caused consumer-software companies to increase the number of IBM-compatible products, including those developed specifically for the PC as opposed to porting from other computers. Bing Gordon of Electronic Arts reported that customers used computers for games more than one fifth of the time whether purchased for work or a hobby, with many who purchased computers for other reasons finding PC games "a pretty satisfying experience".[27]

By 1987, the PC market was growing so quickly that the formerly business-only computer had become the largest and fastest-growing, and most important platform for computer game companies. DOS computers dominated the home, supplanting Commodore and Apple. More than a third of games sold in North America were for the PC, twice as many as those for the Apple II and even outselling those for the Commodore 64.[28] With the EGA video card, an inexpensive clone had better graphics and more memory for games than the Commodore or Apple,[29][30] and the Tandy 1000's enhanced graphics, sound, and built-in joystick ports made it the best platform for IBM PC-compatible games before the VGA era.[23]

By 1988, the enormous popularity of the Nintendo Entertainment System had greatly affected the computer-game industry. A Koei executive claimed that "Nintendo's success has destroyed the [computer] software entertainment market". A Mindscape executive agreed, saying that "Unfortunately, its effect has been extremely negative. Without question, Nintendo's success has eroded software sales. There's been a much greater falling off of disk sales than anyone anticipated." A third attributed the end of growth in sales of the Commodore 64 to the console, and Trip Hawkins called Nintendo "the last hurrah of the 8-bit world". Experts were unsure whether it affected 16-bit computer games,[31] but Hawkins, in 1990, nonetheless had to deny rumors that Electronic Arts would withdraw from computers and only produce console games.[32] By 1993, ASCII Entertainment reported at a Software Publishers Association conference that the market for console games ($5.9 billion in revenue) was 12 times that of the computer-game market ($430 million).[33]

However, computer games did not disappear. By 1989, Computer Gaming World reported that "the industry is moving toward heavy use of VGA graphics".[34] While some games were advertised with VGA support at the start of the year, they usually supported EGA graphics through VGA cards. By the end of 1989, however, most publishers had moved to supporting at least 320x200 MCGA, a subset of VGA.[35] VGA gave the PC graphics that outmatched the Amiga. Increasing adoption of the computer mouse, driven partially by the success of adventure games such as the highly successful King's Quest series, together with high-resolution bitmap displays, allowed the industry to include increasingly high-quality graphical interfaces in new releases.

Further improvements to game artwork and audio were made possible with the introduction of FM synthesis sound. Yamaha began manufacturing FM synth boards for computers in the early-mid-1980s, and by 1985, the NEC and FM-7 computers had built-in FM sound.[18] The first PC sound cards, such as AdLib's Music Synthesizer Card, soon appeared in 1987. These cards allowed IBM PC compatible computers to produce complex sounds using FM synthesis, where they had previously been limited to simple tones and beeps. However, the rise of the Creative Labs Sound Blaster card, released in 1989, which featured much higher sound quality due to the inclusion of a PCM channel and digital signal processor, led AdLib to file for bankruptcy by 1992. Also in 1989, the FM Towns computer included built-in PCM sound, in addition to a CD-ROM drive and 24-bit color graphics.[18]
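
To make the idea of FM synthesis more concrete, the following is a minimal, self-contained Python sketch of classic two-operator frequency modulation, the basic technique these sound cards used to turn simple sine oscillators into harmonically rich tones. It is an illustration only, not code for any actual FM chip; the carrier and modulator frequencies, the modulation index, and the output filename are arbitrary choices.

    # Minimal two-operator FM synthesis sketch (illustrative only).
    # A modulator sine wave varies the phase of a carrier sine wave,
    # producing the bright, complex timbres FM sound chips are known for.
    import math
    import struct
    import wave

    SAMPLE_RATE = 44100      # samples per second
    DURATION = 2.0           # seconds of audio to generate
    CARRIER_HZ = 440.0       # base pitch of the tone (arbitrary)
    MODULATOR_HZ = 220.0     # frequency of the modulating oscillator (arbitrary)
    MOD_INDEX = 3.0          # modulation depth; higher values sound brighter

    samples = []
    for n in range(int(SAMPLE_RATE * DURATION)):
        t = n / SAMPLE_RATE
        # Carrier phase is offset by the modulator's output.
        value = math.sin(2 * math.pi * CARRIER_HZ * t
                         + MOD_INDEX * math.sin(2 * math.pi * MODULATOR_HZ * t))
        samples.append(int(value * 0.8 * 32767))  # scale to 16-bit range

    # Write a mono, 16-bit WAV file so the tone can be auditioned.
    with wave.open("fm_tone.wav", "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)
        out.setframerate(SAMPLE_RATE)
        out.writeframes(struct.pack("<%dh" % len(samples), *samples))

Changing the modulation index or the carrier-to-modulator frequency ratio changes the timbre, which is essentially what FM sound cards exposed through their register settings.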

In the late 1980s and throughout the 1990s, DOS was one of the most popular gaming platforms in regions where it was officially sold.[20]

By 1990, DOS was 65% of the computer-game market, with the Amiga at 10%; all other computers, including the Apple Macintosh, were below 10% and declining. Although both Apple and IBM tried to avoid customers associating their products with "game machines", the latter acknowledged that VGA, audio, and joystick options for its PS/1 computer were popular.[36] In 1991, id Software produced an early first-person shooter, Hovertank 3D, the first in the company's line of highly influential games in the genre. Several other companies also produced early first-person shooters, such as Arsys Software's Star Cruiser,[37] which featured fully 3D polygonal graphics in 1988,[38] and Accolade's Day of the Viper in 1989. Id Software went on to develop Wolfenstein 3D in 1992, which helped to popularize the first-person shooter, kick-starting a genre that would become one of the highest-selling in modern times.[39] The game was originally distributed through the shareware distribution model, allowing players to try a limited part of the game for free but requiring payment to play the rest, and represented one of the first uses of texture mapping graphics in a popular game, along with Ultima Underworld.[40]

In December 1992, Computer Gaming World reported that DOS accounted for 82% of computer-game sales in 1991, compared to Macintosh's 8% and Amiga's 5%. In response to a reader's challenge to find a DOS game that played better than the Amiga version the magazine cited Wing Commander and Civilization, and added that "The heavy MS-DOS emphasis in CGW merely reflects the realities of the market".[41] A self-reported Computer Gaming World survey in April 1993 similarly found that 91% of readers primarily used IBM PCs and compatibles for gaming, compared to 6% for Amiga, 3% for Macintosh, and 1% for Atari ST,[42] while a Software Publishers Association study found that 74% of personal computers were IBMs or compatible, 10% Macintosh, 7% Apple II, and 8% other. 51% of IBM or compatible had 386 or faster CPUs.[33]

By 1992, DOS games such as Links 386 Pro supported Super VGA graphics.[43] While leading Sega and Nintendo console systems kept their CPU speed at 3–7 MHz, the 486 PC processor ran much faster, allowing it to perform many more calculations per second. The 1993 release of Doom on the PC was a breakthrough in 3D graphics, and was soon ported to various game consoles in a general shift toward greater realism.[44] Computer Gaming World reiterated in 1994, "we have to advise readers who want a machine that will play most of the games to purchase high-end MS-DOS machines".[45]

By 1993, PC floppy disk games had a sales volume equivalent to about one-quarter that of console game ROM cartridge sales. A hit PC game typically sold about 250,000 disks at the time, while a hit console game typically sold about 1 million cartridges.[46]

By spring 1994, an estimated 24 million US homes (27% of households) had a personal computer. 48% played games on their computer; 40% had the 486 CPU or higher; 35% had CD-ROM drives; and 20% had a sound card.[47] Another survey found that an estimated 2.46 million multimedia computers had internal CD-ROM drives by the end of 1993, an increase of almost 2,000%. Computer Gaming World reported in April 1994 that some software publishers planned to only distribute on CD as of 1995.[48] CD-ROM had much larger storage capacity than floppies, helped reduce software piracy, and was less expensive to produce. Chris Crawford warned that it was "a data-intensive technology, not a process-intensive one", tempting developers to emphasize the quantity of digital assets like art and music over the quality of gameplay; Computer Gaming World wrote in 1993 that "publishers may be losing their focus". While many companies used the additional storage to release poor-quality shovelware collections of older software, or "enhanced" versions of existing ones[49]—often with what the magazine mocked as "amateur acting" in the added audio and video[48]—new games such as Myst included many more assets for a richer game experience.

During the mid-1990s, many companies sold "multimedia upgrade kits" that bundled CD drives, sound cards, and software, but device drivers for the new peripherals further depleted scarce RAM.[50] By 1993, PC games required much more memory than other software, often consuming all of conventional memory, while device drivers could be moved into upper memory with DOS memory managers. Players found modifying CONFIG.SYS and AUTOEXEC.BAT files for memory management cumbersome and confusing, and each game needed a different configuration. (The game Les Manley in: Lost in L.A. satirizes this by depicting two beautiful women exhausting the hero in bed by repeatedly asking him to explain the difference between extended and expanded memory.) Computer Gaming World provided technical assistance to its writers to help install games for review,[51] and published sample configuration files.[52] The magazine advised non-technical gamers to purchase commercial memory managers like QEMM and 386MAX,[50] and criticized nonstandard software like Origin Systems's "infamous late and unlamented Voodoo Memory Manager",[53] which used unreal mode.
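As an illustration only (this sample is not from the article above; it assumes a typical MS-DOS 5/6 setup with HIMEM.SYS and EMM386.EXE, and the CD-ROM and mouse driver names and paths are hypothetical), the kind of memory-tuned configuration players juggled might have looked roughly like this.

A pared-down CONFIG.SYS:

REM Load the extended (XMS) memory manager, then EMM386 for expanded-memory
REM emulation plus upper memory blocks (UMBs)
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
REM Move DOS itself into the high memory area and enable UMB loading
DOS=HIGH,UMB
REM Push bulky drivers into upper memory to spare conventional RAM
REM (the CD-ROM driver name and path are made up for this example)
DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001
FILES=30
BUFFERS=20

And a matching AUTOEXEC.BAT, loading resident programs high with LOADHIGH (LH):

LH C:\DOS\MSCDEX.EXE /D:MSCD001
LH C:\MOUSE\MOUSE.COM

Getting a particular game to start often meant shuffling exactly these lines, or keeping a separate boot floppy for each game.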
Contemporary PC gaming
See also: Games for Windows
[Image caption: the "PC" logo used by the majority of PC games sold in DVD format, found on most contemporary box art and trailers]

By 1996, the growing popularity of Microsoft Windows simplified device-driver and memory management. The success of 3D console titles such as Super Mario 64 and Tomb Raider increased interest in hardware-accelerated 3D graphics on PCs, and soon resulted in attempts to produce affordable products such as the ATI Rage, Matrox Mystique, S3 ViRGE, and Rendition Vérité.[54] As 3D graphics libraries such as DirectX and OpenGL matured and knocked proprietary interfaces out of the market, these platforms gained greater acceptance, particularly after their benefits were demonstrated in games such as Unreal.[55] However, major changes to the Microsoft Windows operating system, by then the market leader, made many older DOS-based games unplayable on Windows NT and, later, Windows XP (without using an emulator such as DOSBox).[56][57]

Faster graphics accelerators and improving CPU technology resulted in increasing levels of realism in computer games. During this time, the improvements introduced with products such as ATI's Radeon R300 and Nvidia's GeForce 6 Series allowed developers to increase the complexity of modern game engines. PC gaming currently tends strongly toward improvements in 3D graphics.[58]

Unlike the generally accepted push for improved graphical performance, the use of physics engines in computer games became a matter of debate following the 2005 announcement and subsequent release of the Ageia PhysX PPU (Ageia was later acquired by Nvidia), ostensibly competing with middleware such as the Havok physics engine. Issues such as the difficulty of ensuring consistent experiences for all players,[59] and the uncertain benefit of first-generation PhysX cards in games such as Tom Clancy's Ghost Recon Advanced Warfighter and City of Villains, prompted arguments over the value of such technology.[60][61]

Similarly, many game publishers began to experiment with new forms of marketing. Chief among these alternative strategies was episodic gaming, an adaptation of the older concept of expansion packs, in which game content is provided in smaller quantities but for a proportionally lower price. Titles such as Half-Life 2: Episode One took advantage of the idea, with mixed results arising from concerns about the amount of content provided for the price." (wikipedia.org)

"Super Jewel Box" is a more advanced design which offers amongst other improvements a greatly strengthened hinge area. The depth of the disc tray is also greater, allowing for two discs to be placed on top of each other. The super jewel box cannot be used as a direct replacement for the older jewel case design as its card insert for the back is slightly different in size and shape. The super jewel box was developed by Philips[1] and it was intended to be successor to the original jewel case. Some CD manufacturers (for example the high-end company Linn) are supplying them. The super jewel box is the conventional case for Super Audio CD (SACD) releases;[1] a taller "Plus" size, midway between CD and DVD-Video size, is the conventional case for DVD-Audio, and as of mid-2006, the case format for all albums released by the Universal Music Group in Europe.[4]

Many alternatives to the standard jewel case may also be found, including larger DVD-style cases with a more book-like shape. It is not uncommon to find CDs housed in custom cases, tins and boxes of varying shapes and sizes. Slipcases and other envelope-type designs are also occasionally used.[citation needed]

Some DualDiscs are packaged in jewel cases of a somewhat different design from the CD version; the inside edge is rounded instead of flat, and the physical position of the disc is moved slightly toward the spine to make room for a latch mechanism. The overall dimensions of a DualDisc case are roughly the same as a standard CD case. However, the hinge mechanism is smaller and cannot be dismantled as easily as on a standard jewel case.[citation needed]

Smaller jewel cases are used for 8 cm CD and DVD media; similar cases without the hub are used for MiniDisc and (magnetic) Zip disk media.[citation needed]

Additionally, larger jewel cases, around the size of VHS keep cases, were used for North American releases of games for the Sega CD, all North American releases of Sega Saturn games, and games released early in the original PlayStation's life cycle. Because the greater thickness of these cases put the CDs inside at greater risk of being accidentally knocked loose from their hubs, large foam bricks were placed on top of the discs during packaging to hold them in place....

Slimline jewel cases first gained popularity as cases for CD singles sold in Japan and Europe, and have become a common space-saving packaging for burned CDs. The cases used for CD Singles sold in Japan and Europe are 7 mm thick, with a "J-card" type inlay, showing cover art through the front of the case, and also through both the spine and part of the back of the case. The CD itself is usually inserted "upside-down" in the case, so that the artwork on the disc itself shows through the transparent back of the case.[citation needed]

Most slim jewel cases sold for burned CDs measure 142 by 125 by 5 millimetres (5.59 in × 4.92 in × 0.20 in), roughly half the thickness of a standard CD jewel case, allowing twice as many CDs to be stored in the same space; they will generally fit two to a slot in a standard CD rack. They generally do not have room for a full insert booklet, only a slip of paper for a track listing or cover art, showing only through the front of the case. Unlike standard jewel cases, slimline cases are made of two pieces rather than three and do not have a place for a back label. With this design, however, the "spine" is narrower, making the discs more difficult to identify when stored on edge on a shelf.[citation needed]

Most slimline cases are made of translucent or transparent polystyrene and are available in multiple colors. A stronger alternative is made from semi-opaque, semi-flexible polypropylene, which is strong enough to protect the disc but flexible enough not to break easily. Also, the hinge mechanism is inverted compared to the standard-width case, with the pivot arms attached to the lower part of the case rather than to the clear cover side....

In the U.S. and Canada, the jewel box of a music CD was originally packaged for retail sale in a large cardboard box called a longbox in order to fit in store fixtures designed for vinyl records, offer larger space for display of artwork and marketing blurbs, and deter shoplifting. This box also enabled censorship if the store deemed a particular album cover potentially offensive to the public. This packaging was much-criticized as environmentally wasteful, and was eventually dropped by most retailers in the mid-1990s, though major record companies continued to ship CDs to wholesale clubs, such as Costco and Sam's, in longboxes into the 21st century.

Around 1994, the top wrap-around label sticker began to appear on most CDs, making it easier to identify each CD from above without flipping through them to see the front cover. These stickers were usually nothing more than informational labels and rarely played any role in the marketing of the album. The wrap-around sticker also provided an extra seal, possibly as another theft deterrent.

A chiefly Japanese packaging addition is the obi strip, a J-card-like paperboard slip wound around the left side of the case to show details such as the price and artist. The obi strip is particularly useful for Japanese releases of Western artists' material, because the cover artwork is left unaltered from its original-language release....

The simplest, least expensive package is a paper envelope. More expensive versions add a transparent window to the envelope allowing the disc label to be seen. The envelope can also be made out of spunbonded polyethylene (trade-named Tyvek). This is both more durable and less abrasive than paper. However, such packaging is rare for commercial releases due to its relative lack of protection compared with other designs, and is primarily limited to promotional and demo discs.

Paper sleeves are also often used in software packages, where the box carries the promotional labeling but the disc itself comes in a paper sleeve to cut costs....

The Q Pack was developed by the Queens Group Inc. in the mid-1990s as an alternative to the regular CD jewel case. (The Queens Group was purchased by Shorewood Packaging, which is part of International Paper.) The Q Pack does not have a snap-in tray like a regular jewel case. It is characterized by the corrugated raised area where the top hinges to the back. Since Q Pack cases are not transparent, cover art is generally applied as a decal to the cover. Decals can also be applied to the inside front cover, to the tray underneath the hub, and to the back cover. A slot for an insert booklet is found inside the front cover, as on typical jewel cases.

The Q Pack has become one of the calling cards of No Limit Records, who used it often in the mid-to-late 1990s....

The term digipak (the trademarked spelling) or digipack (the generic spelling) refers to a type of CD case consisting of a plastic CD tray glued inside a folding rectangular cardboard cover; in some variations the disc sits on a hub or spindle inside. Though it once referred specifically to the patented Digi-Pak packaging, the term has since become genericized as digipack. Because they are made largely of paper, digipacks were once considered an environmentally friendlier alternative to jewel cases, though they are vulnerable to shelf wear unless stored in plastic sleeves.

A taller form has been used for some DVD movie releases; it is essentially identical to the CD package, though with raised top and bottom sections to keep the disc from sliding out if it becomes disengaged from the hub....

A mini LP sleeve (also called a paper sleeve or cardboard sleeve) is a square cardboard package that looks like a miniaturized version of an LP jacket. The disc slides into the jacket, either into a pocket or through another opening. Mini LP sleeves can appear either as standalone cardboard sleeves or as gatefolds, just like full-sized LP jackets, and both variants have been used for a number of music releases. Some gatefold sleeves are designed with facial dimensions similar (if not identical) to those of a jewel case or digipak, allowing them to be stored more neatly on shelves alongside CDs packaged in the latter two formats; sleeves of this type are occasionally given the alternate name of "digisleeves".

While used in a somewhat limited capacity in the West, where the jewel case remains the most popular form of CD packaging, mini LP sleeves are popular for reissues of older albums in Japan; their typically high level of faithfulness to the original vinyl record packaging makes them highly sought after among collectors." (wikipedia.org)

"Video game packaging refers to the physical storage of the contents of a PC or console game, both for safekeeping and shop display. In the past, a number of materials and packaging designs were used, mostly paperboard or plastic. Today, most console and PC games are shipped in (CD) jewel cases or (DVD) keep cases, with little differences between them.

Aside from the actual game, many items may be included inside, such as an instruction booklet, teasers of upcoming games, subscription offers to magazines, other advertisements, or any hardware that may be needed for any extra features of the game....

As PC games migrated to CDs in jewel cases, the large-format box remained, though to reduce printing costs, manuals came both on the CD and in print inside the front cover; many copy-protection schemes, such as SafeDisc and SecuROM, also came on the disc. Despite the CD jewel case format having been around since the invention of the music CD, very few full-price PC games were released in a jewel case alone. A thicker variation with space for a thick manual was, however, used for most PlayStation and Dreamcast games.

Around 2000, PC game packaging in Europe began to converge with that of PS2 (and later Xbox and Nintendo GameCube) console games, adopting the keep case format in which the vast majority of games are sold to this day. These boxes are sometimes known as Amaray cases, after a popular manufacturer. In the U.S., most PC games continue to ship in plastic DVD cases or cardboard boxes, though the size of such boxes has been standardized to a small form factor. Special packages such as "Collector's Editions" frequently still ship in oversized boxes, or in boxes of a different material, such as a "Steelbook".

In the U.S., the IEMA played a major role in improving, from a retailer's perspective, the way most PC games are packaged. In 2000, many retailers were becoming disenchanted with the salability of PC games compared with their more profitable console counterparts. Oversized software boxes were blamed for poor profitability per square foot (the profit a particular item generates at retail relative to the shelf space it occupies). The IEMA worked with leading game publishers to create the now-standard IEMA-sized box, essentially a double-thick DVD-sized plastic or cardboard box, which effectively increased profitability per square foot by over 33% and appeased merchants and developers alike. Medal of Honor: Allied Assault was one of the first PC games in the U.S. to come packaged in this new standardized box.[1][2][3][4][5][6]

In creating the new box size, the IEMA found itself in the unlikely position of platform guardian: whereas each console platform had a first-party publisher to oversee standardization matters, PC games by their very nature did not. As such, the industry pressured the organization to develop a platform identification mark that would unify the display and focus the customer's brand perception. Again the IEMA worked with publishers to create a new standard "PC" icon, providing its use to the industry on a royalty-free basis.

In 2004, Half-Life 2 was made available for download over the Internet via Steam. A physical boxed copy was also sold, though it too required activation over the Internet. Valve Corporation hoped this method of distribution would take off, as it delivers a greater percentage of the sale price to the game developer than boxed copies do. Valve's belief was not unfounded: Steam became the most common method of PC game distribution by late 2009; Internet distribution had surpassed physical sales even earlier and, as of mid-2011, was unchallenged. Many, if not most, PC games from publishers other than Valve are also released as Steam electronic copies, which regularly outsell physical copies. In addition, Steam's DRM remains one of the most secure available, yet it is relatively non-intrusive compared to schemes like SecuROM, which, by installing kernel-mode drivers (often somewhat inaccurately referred to as "rootkits"), is often incompatible with certain hardware configurations and many pieces of third-party security software (such as software firewalls and anti-virus applications), a problem that does not affect Steam. Steam also allows consumers to back up their copy of Half-Life 2, as well as other games downloadable through Steam, onto CDs or DVDs. To complement this feature, many fans have created box coverings for jewel cases that can be downloaded and printed, giving birth to a wide variety of game packaging styles and designs.

Java games for cellphones are distributed almost exclusively via the internet. It is possible that the proliferation of home broadband will lead to electronic distribution for all games in the future, leaving physical packaging a niche market, though game developers cite the unsolved problem of digital rights management as the main barrier to this." (wikipedia.org)