Monday, 14 September 2015

The Revolution Of Gaming

Hey guys, welcome back to another blog post. Today we will be talking about the revolution of gaming over the past few years. If you guys don't know what gaming is, gaming is the running of specialized applications known as electronic games, especially on machines designed for such programs and, in a more recent trend, on personal computers over the Internet, in which case the activity is known as online gaming. A person who plays electronic games is called a gamer. The term "gaming" originated as a synonym for "gambling," although most electronic games today do not involve gambling in the traditional sense. Hope you guys enjoy the post, and please leave a comment about what we should cover next. Thank you.


Even people who don’t consider themselves gamers are well aware that video games have evolved tremendously in the past 20 years. The graphics have gone from pixelated sprites in early computer games to breathtaking scenery that almost looks real. Mute characters have finally found their voice. The budgets are growing, and so are the development times. However, a number of other more subtle, but equally important, differences have arisen in gaming over the past two decades. From broader audiences to the death of arcades, the face of gaming has seen enormous change.

Back in 1993, video games were viewed mainly as a pastime for children. Most of the top-selling games, such as “Sonic the Hedgehog” and “Super Mario World,” were appropriate for people of all ages. Since gaming was targeted at a relatively niche market, carrying the label “gamer” differentiated you from others.

Companies often target older audiences when creating games, at least more so than they did in 1993. With the exception of Nintendo, which still relies mostly on family-friendly games, many companies count on games made for adults to reap huge sales. Top-selling series like “Halo,” “Call of Duty,” “Assassin’s Creed,” and “God of War” are all developed with older teenagers and adults in mind. Every console now offers a wide assortment of games, ranging from those geared toward adults, to those aimed at children, to those suitable for everyone.

The video games that are exclusive to a certain console are central to defining it. For instance, Nintendo has long been known for the franchises you can find only on its systems, including “Mario,” “The Legend of Zelda,” and “Donkey Kong.” And ever since “Halo” graced the original Xbox as an exclusive title, the Xbox has been known for its strong first-person shooters.
Although exclusive games are still important, there have been fewer of them in recent times. There’s simply more money to be made if a game is on multiple systems, so it shouldn’t be too surprising that many top-selling series like “Call of Duty” are multi-platform. Some franchises that used to be found on only one system have broadened their horizons. “Final Fantasy XIII,” for example, became the first single-player “Final Fantasy” game to be released on both Sony and Microsoft systems.

The primary exception to this rule is Nintendo, which still relies heavily on console-exclusive games. Since the Wii was so radically different from its competitors, a game usually either came out only on the Wii or was released on the Xbox 360 and PlayStation 3. Finding a game on all three platforms was uncommon.

It should also be noted that, for the most part, multi-platform games today are largely the same no matter which console they’re played on. Back in the days of the Super Nintendo and Sega Genesis, a game could carry the same name on two different systems but have many notable differences. For instance, as the magazine Game Informer has noted, the licensed title “Jurassic Park” was released on both systems. However, the Super Nintendo version blended top-down exploration with first-person sections, while the Genesis version was a platformer. Today, though, the differences are rarely notable. One console might offer special support for a certain controller, and another might have some exclusive downloadable content, but the games are fundamentally the same.


Back during the days of the Super Nintendo, we had only one brand-new mainline “Mario” game, and that was “Super Mario World.” Although “Yoshi’s Island” was advertised as a sequel to it, the two games featured largely different styles of play. It wasn’t until “Super Mario 64,” five years after the release of “Super Mario World,” that another entry in the main Mario series would hit store shelves. Nowadays, though, we’ve had “New Super Mario Bros.,” “Super Mario Galaxy,” “New Super Mario Bros. Wii,” “Super Mario Galaxy 2,” “Super Mario 3D Land,” “New Super Mario Bros. 2,” and “New Super Mario Bros. U” over the past several years. Released between 2006 and 2012, these seven games came out at a rate of roughly one per year. Critics from websites such as GameSpot.com have stated that the series might be tiring itself out with so many releases.

Other franchises are equally guilty of churning out a relatively similar game every year. The “Call of Duty” series, for instance, still sells millions, but an increasing number of gamers want the developers to bring fresh ideas to the franchise. If history repeats itself, the series could end up like the “Tony Hawk’s Pro Skater” games, whose popularity slowly fizzled out with each passing year.

Even if companies don’t release a new entry in a series every year, sequels are becoming increasingly common. Among the most anticipated upcoming sequels are “Just Cause 3,” “Uncharted 4,” and “Rise of the Tomb Raider.”

In 1993, arcades were viewed as the forefront of the gaming industry. Many of the best-received console games, such as “Street Fighter II” and “Teenage Mutant Ninja Turtles: Turtles in Time,” began as arcade games. In many cases the arcade games featured options that the home versions didn’t; the arcade version of “Turtles in Time,” for instance, supported up to four players. Plus, arcade games such as “Virtua Fighter” had more advanced visuals than you could find on consoles of the day.

However, arcades are now viewed as relics of the past, places tucked off to the side of skating rinks and movie theaters. With the advanced graphics boasted by today’s consoles, arcade cabinets no longer have a competitive edge over their home counterparts. And since video game budgets are on the rise, it’s awfully difficult to justify making a game that will earn only a few quarters per play. Most arcades today limit themselves to relatively old games for this reason.


This does not, however, mean that gamers have given up on the idea of playing games in short bursts at a low cost. In fact, it can be argued that the app games offered today are like portable arcades. The games are easy to learn and don’t involve huge commitments of time or money.
“App gaming has caused us to expect that we can complete a level in a couple of minutes or that we can save the game at any time,” Matthew Fye, the developer of the app game “Shooting Star,” has commented.
There’s also a parallel in how players pay: to continue playing an arcade game, you had to feed in more money, and in many app games you can unlock additional content by paying extra, as Fye also points out. So, gamers still have access to easy-to-pick-up games when they so desire, even if traditional arcade units are largely a thing of the past.
The two giants competing for your money in the video game world of 1993 were Nintendo, with its Super Nintendo Entertainment System (SNES), and Sega, with its Sega Genesis. Both consoles performed well, so the companies looked poised for success for years to come. However, times began to change. To begin with, Sega made the mistake of launching its next big system, the Sega Saturn, months ahead of schedule as a surprise. In the days before widespread Internet access, it was difficult to get that message out, leaving many gamers and retailers confused. In the eyes of many, this marked Sega’s first huge mistake.

Even more important, however, was Sony’s entry into the video game industry. At first, Sony wanted to make a system in conjunction with Nintendo. But Nintendo pulled the plug on the project because, according to Edge-Online.com, the companies couldn’t agree on how to split the profits.

According to DidYouKnowGaming.com, Sony then went to Sega, only to have the proposed team-up rejected as well. This would prove to be a grave mistake for both Nintendo and Sega, as Sony would come to dominate the market with its original PlayStation and then its PlayStation 2. This competition severely cut into Nintendo’s market share, and it ultimately forced Sega to drop out of the console-making business, concluding with the Dreamcast’s death in 2001.

With Sega no longer making consoles, Microsoft decided to unleash a console of its own in 2001: the Xbox. Although it did not sell nearly as well as the PlayStation 2, particularly outside North America, it cut even further into Nintendo’s market share; of the PlayStation 2, Xbox, and Nintendo GameCube, the GameCube ended up being the weakest seller. Today, Microsoft remains a strong competitor with its Xbox One, which continues to battle the PlayStation 4 for sales worldwide, according to VGChartz.com.

Aside from the specific console manufacturers, it’s important to point out that the hub of video game development shifted from Japan to the United States during the last 20 years. Top-selling games on the SNES included “Super Mario World,” made by Nintendo, and “Street Fighter II,” made by Capcom, both Japanese companies. For the Sega Genesis, top-selling games included “Sonic the Hedgehog” and “Sonic the Hedgehog 2,” both made by Sega itself.


However, consoles today mostly rely on American-made games to garner sales. Series such as “Call of Duty,” “God of War,” “Halo,” and “Guitar Hero” have all been made by U.S. developers. The only exception to this rule is the Nintendo Wii, which has thrived thanks to games made by Nintendo itself. That said, Nintendo seems to be making third-party support from American developers a bigger priority on the Wii U, so even Nintendo recognizes the importance of U.S. development.

Online gaming is arguably the biggest difference between the video games of 1993 and those of today. In 1993, gaming was usually a solitary activity. Sure, you could play against someone else, but even a four-player-compatible game was rare back then. Now, though, you can compete with millions of gamers from the comfort of your living room. Whether you’re challenging another person live or comparing high scores, you have a wide variety of ways to test your skills against others. Some massively multiplayer online role-playing games even require an Internet connection to play because they were built from the ground up with community in mind.

There is also a huge community of gamers on the Internet. Many forums offer places where gamers can chat about the games they love, ask questions, debate certain points and more. People also post videos of themselves playing games on YouTube. Oftentimes, a video is meant to show off an impressive performance, such as a speed run through the game. Other gamers can watch these for their own amusement or to improve their performance. Plus, if a gamer is ever stuck on a certain level, help is available for free just by clicking on a fan-made guide somewhere online.

Gaming has also worked its way into our daily lives, with video games increasingly being used to train and teach us. For example, a school in Stockholm, Sweden, has made Minecraft a compulsory subject, with teachers hoping the game will encourage children to develop their thinking. The U.S. military has also started using Xbox 360 controllers to operate its new laser weapon.
Clearly, much has changed in the video game world over the past two decades. We’ve become more careful about keeping youngsters away from violent video games. We’ve seen the rise and fall of companies and the entrance of mobile app gaming onto the scene. Only time will tell how vastly different video games will be in another 20 years.
