

Tuesday, 27 December 2011

In Depth: 10 ways PCs will change over the next 25 years

The PC in 25 years

The best way to predict what the future holds, they say, is to look to the past, but such a philosophy isn't necessarily the best option when it comes to computers. It's a useful way of extrapolating the numbers to see how fast the processors of the future may be; that's one reason Moore's Law continues to work.

We can even use it to predict how much RAM future machines will have access to and how big hard drives are going to get, but given that the biggest changes to computers come in the way we use them, any predictions of the future would be better left to futurologists, industry wishlists and brief glimpses of roadmaps.

You need only look at the rapidly changing way in which we use our computers to see that it's not just technological advances that have pushed the PC to centre stage in our lives.

The PC has left its expensive, exclusive, elitist origins. Instead of being reserved for education and the workplace, as was once the case, PCs can now be found in nearly all our homes. We've built our whole lives around the things these machines enable us to do.

We use them for everything from storing home movies, music and photos, through to planning trips and holidays, socialising, online shopping, gaming, finding a cure for cancer and searching for extraterrestrial life.

The PC has come a very long way in 25 years. So what does the future hold? How do we get there? Here are the ten advances that we believe will govern the shape of the PC in 25 years' time.

1. Access everything, everywhere

The move to cloud computing has already begun. The idea of everything being held and worked on centrally will have become entirely natural by 2036. That said, there are some obvious issues that will need to be resolved before this can happen, such as security and the underlying infrastructure. But these will be addressed, and soon.

You only need to look at the likes of Microsoft Office 365 and Google Documents to see that this is a revolution in full swing, and one that benefits anyone who needs to access the same data and files from a variety of sites and machines.

Everything from office work to games is a potential distributed target, as long as the infrastructure is robust enough. And when that happens, there's nothing to stop us having entire operating systems in the cloud - virtualisation taken to the next level, if you will.

The immediate knock-on effect of such a connected society is that the power needn't be at the local level as it is now, but rather at the server level. This regression to client-server architecture has already begun.

As Intel's Graham Palmer puts it, 'Every 600 smartphones or 120 tablets drives demand for a new server.'
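Taken at face value, that ratio turns device shipments directly into server demand. Here's a rough back-of-the-envelope sketch; only the 600:1 and 120:1 figures come from the quote, and the device counts are made-up inputs.

```python
# Back-of-the-envelope sketch of the quoted ratio. Only the 600:1 and
# 120:1 figures come from the article; the device counts are made up.

SMARTPHONES_PER_SERVER = 600
TABLETS_PER_SERVER = 120

def servers_needed(smartphones: int, tablets: int) -> int:
    """Estimate how many new servers a given device population drives."""
    return smartphones // SMARTPHONES_PER_SERVER + tablets // TABLETS_PER_SERVER

# e.g. a million new smartphones and a quarter of a million new tablets
print(servers_needed(1_000_000, 250_000))  # 1666 + 2083 = 3749 servers
```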

2. Watch what you want

[Image: Netflix]

In a world where everything is stored in the cloud, localised storage will still have a use, mainly as a backup for your own content or as a cache for your most accessed files. We can also see this linking to a legitimate filesharing system, which will help offload demand from the more popular servers.
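One plausible shape for that local cache is a simple least-recently-used store sitting in front of the cloud, so your favourite files are served from the device itself. The sketch below is purely illustrative; FileCache and fetch_from_cloud are hypothetical names, not any real service's API.

```python
from collections import OrderedDict

# Minimal sketch of a local cache for most-accessed files: a small
# least-recently-used (LRU) store in front of a stand-in cloud fetch.

class FileCache:
    def __init__(self, capacity: int = 4):
        self.capacity = capacity
        self._items = OrderedDict()  # path -> bytes, oldest first

    def get(self, path: str, fetch_from_cloud) -> bytes:
        if path in self._items:
            self._items.move_to_end(path)        # mark as recently used
            return self._items[path]
        data = fetch_from_cloud(path)            # cache miss: go to the cloud
        self._items[path] = data
        if len(self._items) > self.capacity:     # evict the least recently used
            self._items.popitem(last=False)
        return data

cache = FileCache(capacity=2)
cache.get("/movies/trailer.mp4", lambda p: b"...")  # miss, fetched remotely
cache.get("/movies/trailer.mp4", lambda p: b"...")  # hit, served locally
```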

Content that we currently consume, like movies and music, won't ship on physical media. Instead, subscription services will offer access to the latest releases as well as an ever-growing back catalogue. The obvious upside of this is that you'll be able to watch what you want wherever you are, and in whatever form is best for the device you're playing it back on.

Premium services catering to more esoteric tastes and higher-bitrate versions would be obvious differentiators here, although storage and bandwidth won't be an issue by 2036, so ultra high definition television (UHDTV) will be the norm, with a standard resolution of at least 7,680 x 4,320.
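Some quick arithmetic shows what that resolution implies for bandwidth; the resolution is the UHDTV figure above, while the bit depth and frame rate are assumed values.

```python
# Rough arithmetic behind the UHDTV claim. The resolution comes from the
# figure above; the bit depth and frame rate are assumed values.

width, height = 7_680, 4_320
pixels = width * height                     # 33,177,600 pixels per frame
bits_per_pixel = 24                         # 8 bits per RGB channel
fps = 60

uncompressed_bps = pixels * bits_per_pixel * fps
print(pixels // (1920 * 1080), "x the pixels of 1080p")              # 16
print(round(uncompressed_bps / 1e9, 1), "Gbit/s before compression")  # 47.8
```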

Netflix, one of the biggest video-on-demand services in the world, has already predicted that DVD rental sales will peak as early as 2013, when it expects its instant streaming service to take over.

3. Enjoy painless security

The brave new world of cloud computing means a fundamental change in the way we view security. Big business has already voiced serious concerns about the ability to ensure mission-critical information is kept private, while the idea that anyone could potentially hack into everything you own could well slow the adoption of cloud computing before it hits its stride.

While not directly related, recent hacking attempts by Anonymous and LulzSec highlight how unprofessional some businesses are when it comes to security, and how poor they are at maintaining server updates. In order to live our computing lives in the cloud, we'll need to be able to log into such services smoothly and securely.

An overall standard to do this would make the process easier for the end user, but at this stage developers are still fighting against each other, and it's currently impossible to see how we advance from our current swathe of sometimes insecure methods to something that's much more workable and universal.
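To make the idea of a universal sign-on slightly more concrete, one common building block is a signed token that any participating service can verify. The sketch below uses Python's standard hmac module; the secret, the payload format and the overall scheme are illustrative assumptions, not a proposal for the standard itself.

```python
import hmac, hashlib

# Minimal sketch of a signed sign-on token: one service issues it, any
# cooperating service can check the signature. Secret and payload are
# invented for illustration.

SECRET = b"shared-service-secret"

def sign(payload: bytes) -> str:
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    # compare_digest avoids leaking timing information to an attacker
    return hmac.compare_digest(sign(payload), signature)

token_payload = b"user=alice;expires=2036-01-01"
token_sig = sign(token_payload)
print(verify(token_payload, token_sig))     # True
print(verify(b"user=mallory", token_sig))   # False - tampered payload rejected
```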

The old-guard security firms are now setting up in the cloud as well, offering services that scan transactions directly in the cloud. Better security needs to happen, even if the future is hazy.

4. Super-powered, super-portable

[Image: low-power i5]

The word 'computer' refers to a lot more than just a stationary system tethered to a power socket. You only need to look at the model changes in the likes of Apple to see where the future of computing lies: super-thin laptops in the main, with smartphones also doing more and more work and desktops increasingly reserved for specific power-hungry applications like gaming and heavyweight professional tools.

Basic consumer PCs will be limited to compact all-in-one systems built into the back of the screen - a trend that's already being pushed in high-street computer shops. In 25 years' time, your whole PC may even be pocketable - a tiny USB stick-sized device that you can plug directly into your monitor, your smartphone, or any number of public screens.

The migration of more and more features traditionally associated with the motherboard chipset onto the processor itself is already starting to pay dividends on the efficiency front, and we should ultimately see everything shifting into the CPU - including RAM, storage and networking interfaces.

Upgrading an existing machine will be rendered pointless, but with dropping costs, the era of the truly disposable computer will one day be upon us.

5. Forget about recharging

The power requirements of all portable computers will continue to drop, with sub-1W devices capable of such granular self-power management that only the silicon needed for a specific task is powered on.

This is an obvious extension of what mobile phones and mobile processors can already do, but pushed to an even greater degree, where the device's default state is to be completely asleep. The aforementioned computer-on-a-chip designs mean that this will be necessary to stop subsystems drawing power when they aren't needed.
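As a toy model of that behaviour, imagine a scheduler that only powers the blocks a given task needs and leaves everything else asleep. The subsystem names and power figures below are invented for illustration.

```python
# Toy model of per-task power gating: the default state is everything off,
# and only the blocks a task needs are woken. All names and figures are
# illustrative.

SUBSYSTEM_POWER_MW = {"cpu": 300, "radio": 150, "display": 250, "gpu": 280}

TASK_NEEDS = {
    "idle": set(),                         # default state: everything asleep
    "sync_mail": {"cpu", "radio"},
    "play_video": {"cpu", "gpu", "display"},
}

def draw_for(task: str) -> int:
    """Total draw in milliwatts when only the required blocks are powered."""
    return sum(SUBSYSTEM_POWER_MW[block] for block in TASK_NEEDS[task])

for task in TASK_NEEDS:
    print(task, draw_for(task), "mW")   # idle 0, sync_mail 450, play_video 830
```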

Improvements in battery technology, with different materials, should see the fabled 24-hour computing becoming not a one-off rarity, but the norm. Battery designs that are capable of offering different voltages and loads to different areas of the chip can improve efficiency, while price drops and efficiency rises in solar technologies should mean that we can recharge such devices while we're using them.

A standardised charging system that lets tomorrow's devices be charged using a power-efficient induction system means that pads next to our workstations, in our homes and in our (flying) cars will top up the battery without the tangle of cables that currently plagues our lives.

Five more PC advances to expect

6. Turn anything into a display

[Image: tablet]

Screens in our current devices tend to define their size. This is true of everything from laptops to tablets to mobile phones to all-in-one desktop PCs. For stationary machines this isn't much of an issue, but for mobile computers the inclusion of screens tends to constrain their portability and usefulness.

Devices like the Sony Tablet S2 attempt to resolve this problem through the inclusion of a folding split screen, while other smaller devices are designed to be used either on their own, or in conjunction with a larger screen for the best effect.

The latest generation of tablets, for instance, boast mini-HDMI connectors that let them output to a large television or projector. In the future, advanced forms of projection should enable some far more flexible usage patterns.

We've already seen standalone projectors as small as a pack of cards, while Sony has recently released the Handycam HDR-PJ10, a camcorder that boasts an integrated LED projector. Future units will feature brighter lamps with shorter throws, reduced battery consumption and more useful angles.

Digital whiteboard technology will come as standard in the future, so you'll be able to interact with anything you display. Thanks to the reduction in price of high-res panels, certain surfaces will act like TFT screens; plug your mini computer into the wallpaper and you've got a 30ft display.

7. Interact with PCs naturally

Science fiction has a tendency to define our technological aspirations, and has even had an effect on what is researched and implemented. Flying cars, holographic chess and sentient artificial intelligence have all been addressed by literature and movies, and are being actively researched too - however impractical they might be.

But one area where PCs are still struggling to catch up with sci-fi - and aging sci-fi, at that - is the way we interact with our computers. Natural speech input is possible today, but it's not commonplace and it's far from flawless.

The idea of having to train the software to recognise your voice by speaking a list of words isn't appealing to many people either. The accuracy of Google's voice search on mobile phones shows that untrained speech recognition is possible, although this is processed in server farms, showing that a lot of grunt is needed to do things properly.
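The division of labour here is simple: the device just captures audio and ships it off, and the server farm provides the grunt. A minimal sketch of that pattern follows; the endpoint URL and the response format are hypothetical, not a real recognition API.

```python
import requests  # third-party HTTP client

# Sketch of the server-offload pattern: the client sends raw audio and the
# remote recogniser does the heavy lifting. The URL and response shape are
# hypothetical placeholders.

RECOGNISER_URL = "https://speech.example.com/v1/recognise"  # placeholder

def transcribe(audio_path: str) -> str:
    with open(audio_path, "rb") as f:
        resp = requests.post(
            RECOGNISER_URL,
            data=f.read(),
            headers={"Content-Type": "audio/wav"},
            timeout=10,
        )
    resp.raise_for_status()
    return resp.json()["transcript"]   # assumed response shape

# print(transcribe("query.wav"))
```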

Given another 25 years of processor development, this shouldn't be an issue. Unfortunately there's not much that can be done about the awkward embarrassment that comes with talking to a computer.

One key ingredient of human communication is body language, and it's possible that the inevitable successors to Microsoft's Kinect could provide the missing element that would let speech recognition draw cues about subtle meanings in words. Combined with gesture inputs, the days of telling your computer what to do shouldn't remain a dream for ever.

8. Better than Pixar graphics

[Image: Witcher 2 graphics]

If we had to pick the area where the physical makeup of the PC has changed the most in the last 25 years, we'd be hard pushed to beat the advances in 3D acceleration. The first consumer 3D graphics card of note, the 3Dfx Voodoo, appeared 15 years ago and changed the face of gaming, and there have been hints that it could change the interfaces of tomorrow as well.

Advances like VRML may have struggled, but with GPUs now making it into CPUs, the market for 3D interfaces has grown and is ripe for exploiting. Budget graphics cards are going to struggle for significance in the next few years as the graphics capabilities of the APU/CPU increase, but at the higher end of the market we can expect graphics cards to continue to push the envelope for realistic rendering.

Cinema-level rendering techniques like true sub-surface scattering, deep shadow maps and ambient occlusion require significant processing power even on simple models, but as the polygon counts hit the billions and screens use UHDTV resolution, more advanced thousand-core GPUs will be needed.
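Some rough arithmetic shows how quickly those numbers climb: the resolution and frame rate follow the UHDTV figures above, while the per-pixel operation count is an illustrative guess.

```python
# Rough shading arithmetic behind the thousand-core claim. The per-pixel
# operation count is an assumed figure; resolution and frame rate follow
# the UHDTV numbers above.

pixels = 7_680 * 4_320          # 33,177,600 pixels per frame
fps = 60
ops_per_pixel = 10_000          # assumed cost of SSS, shadow maps and AO

ops_per_second = pixels * fps * ops_per_pixel
print(round(ops_per_second / 1e12, 1),
      "trillion shading operations per second")   # ~19.9
```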

With no obvious let-up in the ongoing and productive battle between AMD and Nvidia, there's no reason to doubt that in 25 years' time the lines between cinema, 3D gaming and even desktop interfaces will be blurred to the point of non-existence. Whatever else happens, the future is certainly going to be beautiful.

9. Game on the go

[Image: OnLive]

The concept of the PC as a gaming platform has survived assaults from many angles over the last 25 years. Gaming PCs have responded well to these attacks, with improved graphics, higher resolutions and ever-advancing processors to make them arguably the best platform currently available for gaming - particularly compared to the static world of consoles.

However, it seems unlikely that the gaming desktop will survive another quarter century unscathed. One of the biggest threats is forming right now with streaming gaming services like OnLive and Gaikai.

By 2036 the teething problems of such technologies should have been resolved, letting you play high-end games on pretty much any hardware - PC, phone, console and so on. This means you can start a game during your lunch break at work, play it on the train on the way home, and finish it in front of your television.

And with ubiquitous access, everyone should be playing against each other anyway. The only real problem with these services is the inherent latency of transferring user actions to the servers.

This will mean that so-called 'twitch gamers' who enjoy first person shooters will need advanced networking technology; a separate internet channel for super-low latency transfers will be absolutely essential.
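A quick latency budget makes the problem concrete; every figure below is an illustrative assumption rather than a measurement of any particular service.

```python
# Simple latency budget for streamed gaming; all figures are illustrative
# assumptions.

frame_budget_ms = 1000 / 60            # ~16.7 ms between frames at 60fps

encode_ms = 5                          # capture and encode on the server
network_rtt_ms = 30                    # input up, compressed video back
decode_ms = 5                          # decode and display on the client

input_to_photon_ms = encode_ms + network_rtt_ms + decode_ms
print(round(frame_budget_ms, 1), "ms frame interval")
print(input_to_photon_ms, "ms added lag -",
      input_to_photon_ms / frame_budget_ms, "frames behind")   # 40 ms, 2.4 frames
```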

10. Be recognised everywhere

We're used to being targeted by specific advertisers based on our previous activities online through the use of tracking cookies, but extending this to the real world isn't such an outlandish idea.

The idea was best visualised by the 2002 film Minority Report, but this isn't science fiction. Tests carried out five years ago in Tokyo used RFID tags to create a user-centric advertising environment. The test promoted offers to users' mobile phones as they walked near specific shops, potentially making those offers more attractive if the recipients didn't show interest.
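Conceptually, the trial boils down to a lookup: an identifier seen near a shop maps to that shop's current offer, which is then pushed to the user. The sketch below is a toy version with invented tags, shops and offers.

```python
from typing import Optional

# Toy version of the Tokyo-style trial: a tag seen near a shop is matched
# to that shop's current offer. Tags, shops and offers are all made up.

OFFERS = {
    "coffee_shop": "2-for-1 espresso until 11am",
    "bookstore": "10% off new releases",
}

SEEN_NEAR = {"tag:4f2a": "coffee_shop", "tag:91c7": "bookstore"}

def offer_for(tag_id: str) -> Optional[str]:
    shop = SEEN_NEAR.get(tag_id)
    return OFFERS.get(shop) if shop else None

print(offer_for("tag:4f2a"))   # 2-for-1 espresso until 11am
print(offer_for("tag:0000"))   # None - unknown tag, so no offer is pushed
```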

Replace RFID tags with rudimentary facial recognition and the system is no longer an opt-in experience. Advertisements directed at you based on what you've been doing could soon get tiresome, but at least it means you won't be bombarded by adverts that aren't of any interest or relevance to you. Enjoyed that rollerball match? Why not go to another, or download the footage of the game you attended?

A key concept behind this is the notion often termed the 'internet of things', which describes the connections between various electronic devices and the data they have access to. It's a concept that many futurologists are convinced will be necessary to tie all our devices together and produce a coherent technological future.

There is a potential danger here though, as outlined by futurologist Ian Pearson: "If the internet of things is not done properly you can just end up with a 1984-type surveillance state." We don't see that happening in 2036, but the potential is there.


