Review of "Rethinking the Design of the Internet" by David Clark

David Clark's article on rethinking the design of the Internet analyses the original design goals against the way the Internet is actually used today. The Internet was designed around the end-to-end principle: intelligence lives in the communicating hosts, not in the underlying communication infrastructure. It is therefore the responsibility of the users and systems at the ends to check data for validity and to process or discard it, while the intermediate systems provide only minimal functionality. This is a large part of what made the Internet so popular: system designs stay simple, and attaching a new end node takes minimal effort.
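To make the principle concrete, here is a minimal sketch (mine, not from the article; the checksum scheme is illustrative) of an end-to-end validity check: the sender attaches a digest, the network merely forwards bytes, and the receiving host alone decides whether to accept or discard the data.

```python
# A minimal sketch of an end-to-end validity check: intelligence sits
# in the end hosts; the network in between only forwards bytes.
import hashlib

def send(payload: bytes) -> tuple[bytes, str]:
    # The sending host attaches a digest of the data.
    return payload, hashlib.sha256(payload).hexdigest()

def receive(payload: bytes, digest: str) -> bytes:
    # The receiving host checks validity itself and discards bad data;
    # no intermediate system is trusted to do this for it.
    if hashlib.sha256(payload).hexdigest() != digest:
        raise ValueError("corrupted in transit: discard or re-request")
    return payload

data, tag = send(b"hello, world")
assert receive(data, tag) == b"hello, world"
```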
The Internet was designed with the belief that it would be used by experts, mainly for scientific or military applications. Looking at current trends and the parties now involved, this is no longer the case:
The original assumption of trusted end systems has vanished entirely.
The Internet is now used for real-time streaming, which the original design goals never considered.
ISPs now provide services, with third parties involved in the traffic they carry.
Users are far less sophisticated.
The Internet is currently used by all age groups, which means its beneficiaries are no longer only the scientific community. This invites public interest in content. Governments and managements want control over the messages delivered. There are also scenarios where users want proof of a transaction, and others where they want to remain anonymous. There are questions about how far one can trust software or hardware: applications can monitor user activity, and hardware identifiers can track a user from any part of the world without the user's knowledge. And the use of the Internet for spreading unwanted messages (spam) or for denial-of-service attacks forces end systems to become ever more sophisticated, which is to say, more complicated every day.
The possible ways to respond are:
Modify the end node so that settings control the applications, for example keeping adult content away from children, or supporting lawful tracking of user transactions through changes to the browser design. But government monitoring of content undermines the end-to-end assumption.
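A minimal sketch of such an end-node control, assuming a hypothetical rating table standing in for whatever rating source a real browser setting would consult; the browser checks a user-configured policy before fetching a page:

```python
# A minimal sketch of an end-node content control. The category names,
# hostnames and the SITE_RATINGS table are hypothetical placeholders.

BLOCKED_CATEGORIES = {"adult", "gambling"}

SITE_RATINGS = {
    "example-adult.test": "adult",
    "example-news.test": "news",
}

def allowed(host: str) -> bool:
    # Unrated sites pass here; a stricter policy could block them instead.
    return SITE_RATINGS.get(host, "unrated") not in BLOCKED_CATEGORIES

print(allowed("example-news.test"))   # True: fetch proceeds
print(allowed("example-adult.test"))  # False: the end node refuses
```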
We can add functionality to the core using firewalls, traffic filters and NAT elements. Firewalls are widely used to protect an island of nodes from the rest of the network, with filtering normally done at the network layer; there are also application-layer filters such as application proxies. NAT boxes removed the fear of running out of public IPs, but they also broke the original design principle that addressing remains unchanged during end-to-end transmission.
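The loss of that principle is easy to see in a sketch of what a NAT box does (addresses are illustrative, drawn from the reserved documentation ranges): the source endpoint the far end sees is not the one the originating host used.

```python
# A minimal sketch of NAT address rewriting: the private source
# endpoint is replaced by a public one, so addressing is no longer
# unchanged end to end.

class Nat:
    def __init__(self, public_ip: str):
        self.public_ip = public_ip
        self.table = {}        # (private_ip, private_port) -> public_port
        self.next_port = 40000

    def outbound(self, src_ip: str, src_port: int) -> tuple[str, int]:
        key = (src_ip, src_port)
        if key not in self.table:             # allocate a mapping once,
            self.table[key] = self.next_port  # then reuse it for the flow
            self.next_port += 1
        return self.public_ip, self.table[key]

nat = Nat("203.0.113.7")
print(nat.outbound("192.168.1.20", 5555))  # ('203.0.113.7', 40000)
print(nat.outbound("192.168.1.20", 5555))  # same mapping reused
```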
ISP operation can be modified to control the content passed. But once traffic is encrypted, the interests of such third parties can no longer be served.
We can label content, for example marking advertisement messages with "Adv", or embedding descriptive metadata in a web site.
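If senders cooperate with such labelling, the receiving end can act on the label without inspecting the content at all. A minimal sketch, assuming an "Adv:" subject-line convention:

```python
# A minimal sketch of label-based handling: the receiver routes a
# message on its declared label, not on its body.

def classify(subject: str) -> str:
    return "advertising" if subject.upper().startswith("ADV:") else "normal"

print(classify("Adv: cheap flights this week"))  # advertising
print(classify("Meeting moved to 10am"))         # normal
```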
End-to-end designs can use anonymisers, content filters and content caches for better performance and less overhead for the end user.
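A content cache is the simplest of these to sketch: repeated requests for the same URL are served locally, so only the first costs a trip across the network (fetch_from_origin is a hypothetical stand-in for a real HTTP fetch):

```python
# A minimal sketch of a content cache sitting between the end user
# and the origin server.

cache: dict[str, bytes] = {}

def fetch_from_origin(url: str) -> bytes:
    # Placeholder for an actual network request.
    return f"<contents of {url}>".encode()

def get(url: str) -> bytes:
    if url not in cache:                     # miss: go to the origin once
        cache[url] = fetch_from_origin(url)
    return cache[url]                        # hit: served locally

get("http://example.test/page")  # first request travels the network
get("http://example.test/page")  # second is answered from the cache
```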
We can use trusted third-party services for PKI and content analysis.
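This is how TLS on today's web already works: a client trusts a server only because a certificate authority it already trusts vouches for it. A minimal sketch using Python's standard library (example.com is a placeholder host):

```python
# A minimal sketch of relying on a trusted third party (a certificate
# authority) to authenticate a remote server.
import socket
import ssl

def fetch_verified_cert(host: str, port: int = 443) -> dict:
    context = ssl.create_default_context()   # loads the trusted CA bundle
    with socket.create_connection((host, port)) as sock:
        # The handshake raises SSLCertVerificationError unless a
        # trusted CA vouches for this host.
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

if __name__ == "__main__":
    cert = fetch_verified_cert("example.com")
    print(cert["subject"])
```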
We can adopt non-technical solutions, such as enforcing law and order through cyber laws created by amending existing ones.
The bad side of the Internet is that we get any kind of data in the cheapest possible way. As users we see the number of spam messages and porn advertisements grow day by day. There are reports of more and more cybercrime, and of problems with the content delivered, especially its use to spread terrorism. We can never fully trust the systems or the software we use: the licence agreement says to use it "as is", that the vendors are not responsible for losses caused by using it, and that any dispute falls under the court of XXX country and YYY place. Newspapers describe how Internet fraud has cost many people their property, and report on the new culture of social networking and private communication mechanisms. These are the wrong sides of the Internet.
The good side is that we get any kind of data in the cheapest possible way. The Internet makes it possible to exploit information available all over the world, and it lets users share and collaborate on any kind of work far more efficiently than earlier methods allowed. It has enabled revolutions in many areas, such as the Human Genome Project and the SETI@home distributed computing model, solving open problems of public interest. It is now used by everyone, despite its problems. This shows the success of the Internet, and of computerisation in general.
The question is: what do we lack, and who will improve the situation? Should private billion-dollar companies, the military, or governments decide the future of the Internet and the ways in which it may be used? To answer that, let us compare it with existing technologies and how they addressed similar issues.
With the telephone or the post, anybody could always send objectionable content, and both allowed anonymity. People have used encryption in communication media for a long time. Magazines and TV channels deliver all kinds of material, including objectionable material, and have rarely shown viewers any warning that it is meant for adults.
The way these media are controlled is collective, by the parties involved. An individual who uses TV or magazines is aware of the possible content and prevents it from reaching a child. Organisations, both for-profit and non-profit, work together to control the material delivered through these media. The postal and telephone companies have monitoring facilities that law-enforcement agencies can use.
What do these systems have that the Internet lacks? Collective control. Today the Internet runs on software and hardware that is not auditable, and its design is controlled not by non-profit or law-making organisations acting in the public interest but by corporations that set the standards with their own vision of making more profit. ISPs choose to charge based on the service rather than the size of the data; that is as absurd as someone charging you 20 bucks for a sheet of paper if you use it for essay writing and 2 bucks for the same sheet if you use it to wrap something. Instead, ISPs should compete on quality of service: 24-hour uptime, minimum congestion, preservation of the offered bandwidth, and so on. And just as the first and last post offices stamp a letter in a way that can be trusted, we need stamps applied by the ISPs so that any data delivered is traceable with trust.
The other thing to consider is who the trusted parties can be, and whether we need trusted parties per country or jurisdiction, or whether a consortium would do.
A short set of suggestions for a better Internet:
Trustworthy stamping by ISPs so that communication can be traced (a minimal sketch follows this list).
Open system and network architectures, auditable by public-interest experts and non-profit organisations.
Cyber laws at the national and international level, and cooperative efforts by all parties to control objectionable material.
Awareness among users: a person holding the world's most accurate gun is not secure if he does not know what he holds, so educating users in effective use of the Internet matters more than trying to build so-called "more secure" machines.
Quality-based charging, with cost-effective service design matched to society's demand.
Monitoring facilities at the ISP level to track terrorism and threats to the public, and filtering facilities if needed, subject to the courts of the country in which the service is offered.
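To make the stamping suggestion in the first item concrete, here is a minimal sketch, assuming each ISP holds a secret key and signs the message together with all earlier stamps, much as post offices postmark a letter; the key names are hypothetical:

```python
# A minimal sketch of ISP "stamping": each carrier appends a keyed
# signature over the message plus every earlier stamp, so the chain
# of custody can later be verified (or disputed) hop by hop.
import hashlib
import hmac

def stamp(message: bytes, trail: list[bytes], isp_key: bytes) -> bytes:
    # Signing the message together with the earlier stamps means the
    # trail cannot be reordered or truncated without detection.
    return hmac.new(isp_key, message + b"".join(trail),
                    hashlib.sha256).digest()

msg = b"user data"
trail: list[bytes] = []
for key in (b"first-isp-key", b"transit-isp-key", b"last-isp-key"):
    trail.append(stamp(msg, trail, key))

# Under a court order, each ISP can re-derive its stamp and confirm
# whether it carried this exact message.
print(len(trail), "stamps on the message")
```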
