Wednesday, November 23, 2005

BT: Making the Internet work better

Cath Everett
ZDNet UK
November 23, 2005, 10:15 GMT

Peter Hovell heads up the BT wing charged with improving the Internet, which includes the telco's multi-billion pound 21st Century Network project


The marketing mission statement at BT's Network Research Centre reads: "The NRC vision is a global network that can be accessed from any device, anywhere, whilst providing security, appropriate service quality and being economic to deploy and operate".

Peter Hovell, who heads up the unit, puts his group's ultimate mandate rather more plainly. "If we can make the Internet a nicer experience and provide a range of quality applications, whether they be voice, video or whatever, the Net will become more what people wanted — a nicer place where applications just work and are more instantaneous as networks get faster, and where data will appear as required," he says.

NRC is based at Adastral Park near Ipswich, along with BT's other research centres, and has been in existence since the organisation was part of the Post Office. In those early days its research focused on enhancing PSTN networks, but the emphasis has since shifted to the Internet and its associated protocols and technologies.

Making it all work better
To achieve this aim of "making it all work better", BT has developed two parallel projects that will provide the foundation for the overall vision of a more efficient network: the 21st Century Network (21CN) and New Internet Architectures (NIA).

The 21CN is a five-year initiative in which BT will replace its circuit-switched copper networks with a single IP-based core infrastructure built on optical fibre. The aim is to provide customers with a single network, using both fixed and wireless links, to access broadband-based voice and data services from anywhere in the country and on any number of devices. The scheme is expected to be completed by 2009 at a total cost of £10bn.

But while Hovell's researchers are "generally looking at how networks will evolve in the future", they are also investigating how to optimise performance at the access level rather than simply in the core network.

"Access networks such as DSL are always increasing in speed, but there comes an economic limit if you want to use different technologies to improve the operational costs of running and maintaining fibre, and if you want to add new services and facilities," says Hovell. "So we're quite interested in passive optical networks where there's one fibre from the network and we use splitters to deliver the signal to many people."

Hundreds of signals
While each home or business could be provided with its own fibre line, this is not only an expensive option, but one that also results in "potential termination problems in exchanges because there are so many cables coming in".

Having only one fibre in the network and splitting it into hundreds of signals, however, currently "looks like the more feasible and economic deployment", says Hovell. Radio technology, on the other hand, could be a suitable choice "for mobility in the home and home distribution".

"Because it would be a shared infrastructure with several hundred users, each would have a moderate bandwidth of say 10Mb per second, but when people aren't using it, which is a considerable amount of time, we could steer it to others that require it. So there's the potential to burst up to Gigabit per second downloads," said Hovell.

This sort of service would enable customers to download music and video in seconds, provide network backup for home videos and support thin-client access to remote applications, particularly for small businesses that currently struggle to maintain their own PCs.

DVD-quality audio
But the work that is going into the IP-based 21CN is also feeding into the NIA project, which involves updating the fundamental structure of the Internet "to make it a better and safer place to be".

"The Internet was invented many years ago by academia, but when the inventors worked on it originally, there was no concept of economics or bad guys. They invented an architecture that was a free-for-all, so everyone could share information and everything was nice and kind," says Hovell. "But the world has changed, the use of the Internet has changed and, fundamentally, the architecture has not evolved to take account of that."

In an attempt to kick-start this evolution, BT is working on technologies that can be deployed incrementally to improve the way the Internet functions. One such initiative focuses on developing a feedback mechanism to alleviate congestion-related performance problems.

Trafficmaster
"If you're driving a car, you know when you've been stuck in a traffic jam, but you don't know when the next one is going to happen. However, if you introduce Trafficmaster, it will tell you what's coming up," Hovell explains. "It's the same analogy with the Internet. An Internet packet won't know when the network is congested, but a feedback mechanism can predict when the packet is going to a congested node and hence can alter its routing dynamically."

Nonetheless, he describes such technology as a "mid- to long-term solution" and does not expect it to appear commercially for another five to ten years. "It's still very early days in the research cycle and, although it's already got considerable traction in the academic and Internet world, it takes much longer to standardise things like this and have them built into equipment. Take VoIP — it's been around for years, but it's only just starting to happen," he says.

21CN
Two other key areas of focus run across both the 21CN and NIA programmes: quality of service (QoS) and security, although Hovell was not prepared to talk about the latter due to its sensitive nature. The focus here is not on creating "stovepipe solutions that only work on the core or access networks, but end-to-end solutions that are also cheap to deploy and run", he explains.

To illustrate the point in QoS terms, BT has introduced the concept of Guaranteed Quality of Service Synthesis (GQS). This again involves tackling congestion on the Internet, but this time with the idea of returning the equivalent of an engaged signal when the network is too busy to carry a new call — a mechanism that is likely to become more important as networks increasingly carry voice and video traffic.

Abnormal loads
While Hovell acknowledges that over-provisioning means the Internet fails only rarely, whether through congestion, equipment faults or abnormal loads, there is less and less tolerance for even the smallest interruption to service. "We do need a quite cheap solution, but one that also works, so we developed one called GQS. This surrounds the core network with gateways and measures congestion on the actual path in real-time to decide whether to admit a call or not, rather than relying on a centralised device adding up individual bits of bandwidth," he says.
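A minimal sketch of that idea, assuming a hypothetical edge gateway, a sliding window of congestion-marked packets and an invented two per cent admission threshold (none of these figures come from the GQS drafts themselves), might look like this:

# Hedged sketch of measurement-based admission control at an edge
# gateway: admit a new call only if recently observed congestion on
# the actual path stays below a threshold -- no central bandwidth
# broker involved. Window size and threshold are assumptions.

from collections import deque

CONGESTION_THRESHOLD = 0.02   # assumed: admit while <2% of packets marked

class EdgeGateway:
    """Toy edge gateway tracking congestion marks seen on a path."""

    def __init__(self, window: int = 1000):
        self.marks = deque(maxlen=window)   # sliding window of recent packets

    def observe(self, congestion_marked: bool) -> None:
        """Record whether a packet on the path carried a congestion mark."""
        self.marks.append(congestion_marked)

    def admit_call(self) -> bool:
        """Admit a new voice/video call only if the path looks clear."""
        if not self.marks:
            return True                     # no evidence of trouble yet
        return sum(self.marks) / len(self.marks) < CONGESTION_THRESHOLD

gw = EdgeGateway()
for _ in range(990):
    gw.observe(False)                       # unmarked packets
for _ in range(10):
    gw.observe(True)                        # 1% marked: under threshold
print(gw.admit_call())                      # True -- call admitted

for _ in range(30):
    gw.observe(True)                        # congestion builds to 4%
print(gw.admit_call())                      # False -- engaged signal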

Three NRC researchers submitted draft specifications for GQS to a meeting of the Internet Engineering Task Force (IETF) in Canada in November 2005, and the hope is that the technology could start appearing in commercial equipment in two to five years' time.

