NSF Gives Peek At Plans To Overhaul Internet

caroper

Executive Member
Joined
Aug 5, 2003
Messages
8,163
The National Science Foundation (NSF) has given a glimpse of a proposed
initiative to redesign the Internet. Though short on details and
currently without funding, the project, called the Global Environment
for Networking Investigations, is intended to take a clean-slate
approach to designing a new Internet, one that addresses some of the
major shortcomings of the current Internet, including security and the
growing numbers of individual devices that connect to the network.
Increasing transfer speeds is not one of the project's goals. Leonard
Kleinrock, computer scientist at UCLA and one of the developers of
Arpanet, precursor to the current Internet, noted that early developers
of the Internet did not anticipate its current reach and had no reason
to include security as a primary concern. In addition, the network was
not designed to accommodate the vast numbers of mobile and wireless
devices, as well as remote sensors, that now vie for Internet space.
The NSF is seeking participation from other government agencies and
from other countries for the project.
New York Times, 29 August 2005 (registration req'd)
http://www.nytimes.com/2005/08/29/technology/29internet.html
 

Kei

Banned
Joined
Jul 10, 2004
Messages
1,220
Now to comment as I am working with the current protocols:

I disagree, as there are not many major shortcomings in the current protocols. Jon Postel tried to be forward-looking with these protocols, and he did realise that they would be in wide use someday. Many of the notes in his RFCs suggest a lot of out-of-the-box thinking about these things.

People like myself have found it useful for putting embedded systems on the internet, such as sensors, controllers, etc. This is basically to enable ease of use and networking practically for free in environments where networks already exist. These devices should never be exposed to the open internet, though, because most of them are embedded systems with limited resources and cannot be expected to handle a lot of real-world traffic on the 'net. The iBurst network, with all its "port scan" traffic, is a real case where embedded systems get flooded with "trash" packets and hang!
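A toy sketch of the resource problem Kei describes (all numbers here are made up for illustration): an embedded device has only a handful of packet buffers, so under a burst of unsolicited scan traffic it must drop almost everything. Firmware that cannot drop cleanly is the kind that hangs.

```python
from collections import deque

# Toy model: an embedded device with a tiny receive buffer being hit by
# a burst of unsolicited "trash" packets. Sizes are assumptions, not
# taken from any real device.
RX_BUFFER_SLOTS = 8                          # assumed tiny packet buffer
burst = [f"scan-{i}" for i in range(100)]    # 100 junk packets in one burst

rx_queue = deque(maxlen=RX_BUFFER_SLOTS)     # well-behaved: evict when full
dropped = 0
for pkt in burst:
    if len(rx_queue) == RX_BUFFER_SLOTS:
        dropped += 1        # firmware that can't shed load like this may hang
    rx_queue.append(pkt)    # maxlen deque evicts the oldest packet when full

print(f"buffered: {len(rx_queue)}, dropped: {dropped}")
# → buffered: 8, dropped: 92
```

The point is only that the drop rate is overwhelming relative to the buffer, which matches the behaviour Kei reports on iBurst.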

The security issue comes down to a lack of responsibility. This is where some of the internet standards need work, not the protocols. Some of the standards should be reworked so that accountability and legal aspects are taken into account.
 

gkm

Expert Member
Joined
May 10, 2005
Messages
1,519
IPv6 provides a lot more options for security headers in packets etc., and has so many addresses that each person currently alive could have their own network the size of the internet. So, while it is good to do research on future networks, like Kei I am also not sure the focus should be on something to replace the internet. I doubt anything that is not compatible with the current internet would be adopted in the foreseeable future.
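gkm's address-space claim checks out with rough arithmetic. A back-of-the-envelope sketch (the 6.5 billion population figure is an assumed 2005 estimate, and "the size of the internet" is taken to mean the full IPv4 address space):

```python
# Rough check of the claim that every person could have their own
# IPv4-sized "internet" under IPv6.
IPV4_ADDRESSES = 2 ** 32           # full IPv4 address space
IPV6_ADDRESSES = 2 ** 128          # full IPv6 address space
POPULATION = 6_500_000_000         # assumed 2005 world population

per_person = IPV6_ADDRESSES // POPULATION
internets_per_person = per_person // IPV4_ADDRESSES

print(f"IPv6 addresses per person: {per_person:.3e}")
print(f"IPv4-sized internets per person: {internets_per_person:.3e}")
```

This works out to roughly 10^19 IPv4-sized address spaces per person, so the claim holds with enormous room to spare.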
 