Active Distributed Computing Projects - Internet
These links take you to other project categories on this site:
Mathematics Art Puzzles/Games Miscellaneous Distributed Human Projects
Collaborative Knowledge Bases Charity
See the bottom of this page for a description of the icons on the page.
Project Information | Project % Complete | Major Supported Platforms
The Gómez peerReview project evaluates the performance of large websites to
find bottlenecks using the PEER client. Some users will be paid up to $45 (US)
per month for their contributions. The project also began a "Last Mile"
monitoring project in November 2003, and its first newsletter was published in
September 2004.
Note: peerReview does its work when a network connection is present. Modem users will notice that the client is not active while they are offline. A work unit is completed in 15 minutes, so it is possible for modem users to contribute useful work. The client supports some firewalls; it does not support users behind proxy servers.
Version 5.0 of the PEER client is available as of November 22, 2003 for Windows. The client allows you to configure how much work your system contributes. PEER users' clients will update automatically (or you can download the client software).
Note to previous users: to upgrade to the latest version of the software, remove your old client from your system, then download the new version and install it.
Note to Win98SE users: in v5.0, Build 900 (the most recent version of the client), JAVA.EXE appears to saturate the CPU. Gómez has been notified of the problem.
Grub, owned by
LookSmart, is "an
open-source distributed Internet crawler." The client "crawls" websites
to see which sites have changed their content, and updates a master search
index in real-time. grub.org will create and maintain the most comprehensive
and up-to-date search index of the Internet ever, and will provide update
feeds of crawled sites to the public for free and to commercial search engines.
Version 2.3.0 of the client is available for Windows as of May 26, 2004. It includes many improvements. Version 1.0.5 of the client is available for Linux as of January 28, 2003. Bug fixes and new features for each version are described in the ChangeLog.
Join a discussion forum about this project.
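Grub's core task is detecting whether a page has changed since the last crawl. One common way to do this, comparing a stored digest of the page with a digest of the freshly fetched content, can be sketched as follows (a hypothetical illustration in Python, not Grub's actual client code; the function names are invented):

```python
import hashlib

def digest(content: bytes) -> str:
    """Return a stable fingerprint of a page's content."""
    return hashlib.sha256(content).hexdigest()

def has_changed(stored_digest: str, fresh_content: bytes) -> bool:
    """True if the freshly crawled content differs from the stored copy."""
    return digest(fresh_content) != stored_digest

# The search index stores a digest from the last crawl of each page
old = digest(b"<html>v1</html>")
print(has_changed(old, b"<html>v1</html>"))  # False: page unchanged
print(has_changed(old, b"<html>v2</html>"))  # True: re-index this page
```

A crawler that only re-indexes pages whose digests differ avoids re-processing the vast majority of the web on each pass, which is what makes real-time index updates feasible.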
Capacity Calibration tests and monitors website performance in real-time with controlled
capacity loads. Participants are paid $0.30(US) per hour of testing. Payments
are made via PayPal: users are paid when
their earnings total $1(US) or more.
Participants must have a full-time, high-speed Internet connection and must run the CapCal Java client application at least 10 hours per day or 70 hours per week. Under Windows 98 and Windows ME, the client only runs when the screen-saver is active. Under Windows 2000 you can configure it to run any time or only during specific times. You can register to be a participant here.
See white papers and a document called "Choking Big Bertha - the Art and Science of Web Capacity Measurement", written by the founder of CapCal. These documents were the beginning of the CapCal project.
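CapCal's central idea is driving a site with a controlled, fixed level of concurrent load while recording response times. A minimal sketch of that pattern, using a stub function in place of a real HTTP request (all names here are invented for illustration, not CapCal's client code):

```python
import threading
import time

def timed_call(fn, results, lock):
    """Time one request and record its latency."""
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    with lock:
        results.append(elapsed)

def run_load(fn, concurrency: int, rounds: int):
    """Issue `concurrency` simultaneous calls, `rounds` times over.
    Returns the list of observed latencies in seconds."""
    results, lock = [], threading.Lock()
    for _ in range(rounds):
        threads = [threading.Thread(target=timed_call, args=(fn, results, lock))
                   for _ in range(concurrency)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    return results

# Stub standing in for an HTTP request to the site under test
def fake_request():
    time.sleep(0.01)

latencies = run_load(fake_request, concurrency=5, rounds=4)
print(f"{len(latencies)} requests, avg {sum(latencies)/len(latencies)*1000:.1f} ms")
```

Holding concurrency constant per round is what makes the load "calibrated": the tester knows exactly how many simultaneous requests the site was handling when each latency was observed.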
NETI@home provides network researchers with "a wealth of
end-to-end data that has been previously unavailable" in order to make the
Internet faster and more reliable.
"NETI@home is an open-source software package that collects network performance statistics from end-systems. ... NETI@home is designed to run on end-user machines and will collect various statistics about Internet performance. These statistics will then be sent to a server at the Georgia Institute of Technology (Georgia Tech), where they will be collected and made publicly available. We believe that this tool will give researchers much needed data on the end-to-end performance of the Internet, as measured by end-users. Our basic approach is to sniff packets sent to and received by the host and infer performance metrics based on these observed packets. NETI@home users will be able to select a privacy level that will determine what types of data will be gathered, and what will not be reported. NETI@home is designed to be an unobtrusive software system that runs quietly in the background with little or no intervention by the user, and using few resources.
"NETI@home is written in C++ and uses the popular Ethereal network analyzer to sniff packets. NETI@home also uses the zlib compression library. ... NETI@home has been developed as a thesis project at Georgia Tech and has been designed in accordance with the CAIDA specifications on network statistics.
"NETI@home includes the NETIMap application, written in Java, to encourage the use of NETI@home. The NETIMap application, when run, will display a geographical map of the world. As Internet hosts are contacted (a website, for example), a dot is placed on that host's calculated coordinates. The coordinates are calculated using CAIDA's NetGeo database."
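The passive-measurement approach described above, sniffing packets and inferring performance metrics from them, can be illustrated with a toy example. One classic passively inferred metric is TCP round-trip time, estimated from the gap between a data segment and its acknowledgment. This sketch runs over a hard-coded simulated trace rather than live captures, and is not the project's C++/Ethereal code:

```python
def rtt_samples(events):
    """events: list of (time, kind, seq) where kind is 'sent' or 'acked'.
    Pair each data segment with its ACK and return RTT estimates in seconds."""
    sent = {}
    rtts = []
    for t, kind, seq in events:
        if kind == 'sent':
            sent.setdefault(seq, t)        # keep first transmission only
        elif kind == 'acked' and seq in sent:
            rtts.append(t - sent.pop(seq))
    return rtts

# Simulated sniffed trace: three segments, ACKs arriving ~40 ms later
trace = [
    (0.000, 'sent', 1), (0.010, 'sent', 2), (0.020, 'sent', 3),
    (0.041, 'acked', 1), (0.052, 'acked', 2), (0.060, 'acked', 3),
]
print([round(r, 3) for r in rtt_samples(trace)])  # [0.041, 0.042, 0.04]
```

Because the host already sends and receives these packets, metrics inferred this way cost no extra traffic, which is why the real client can run "quietly in the background ... using few resources."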
Warning: a user has reported that he installed NETI@home on a Win2000 SP4 machine, and after uninstalling it, he noticed there was a service running, trying to execute Neti@home every 20 minutes. He disabled the service, but was unable to remove the registry entries because there was always an "Error during delete." NETI@home personnel responded with the following information:
As a temporary solution, I would recommend using Control Panel->Administrative Tools->Services to stop the service and also to have the service disabled at startup. The strange thing is that the service code is located in "Program Files\Ethereal\neti" and if the program has been uninstalled, this folder should not exist or be empty. NETI@home runs as a service to avoid users having to manually start it and so that it can capture data when users are not logged in (just sitting at login prompt). To manually run NETI@home you can run the go.bat file located in "Program Files\Ethereal".
This project is creating an Internet search index through a distributed
website-crawling client based on the Microsoft .NET architecture. You can
run the client to crawl websites and contribute to the
search index. For now the project is
only indexing pages related to distributed computing.
To participate in the project, download the client, then download the Microsoft .NET runtime library if you don't already have it installed. Version 1.000 of the client entered public beta testing on May 31, 2004.
334 URLs crawled per day, on average
DIMES (Distributed Internet MEasurements & Simulations) studies the structure
and topology of the Internet. DIMES is part of
EVERGROW. "The DIMES agent performs
Internet measurements such as TRACEROUTE and PING at a low rate, consuming at
peak 1KB/S. The agent does not send any information about its host's
activity/personal data, and sends only the results of its own
measurements. Aside from giving a good feeling, running the DIMES agent will
also provide you with maps of how the Internet looks from your home
(currently) and will (in the future) provide you with a personalized 'Internet
weather report' and other user-focused features."
The client uses very little CPU and data bandwidth. It supports users behind firewalls. It requires you to have Java 1.4 or later installed on your system. The client is only available with a Windows installer for now. Source code for the client is available for download.
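The agent's defining property is that it paces its measurements so they consume negligible bandwidth. A minimal sketch of such a rate-limited measurement loop, using TCP connect time to a local listener as a stand-in for PING (this is illustrative Python under invented names, not the DIMES agent's code):

```python
import socket
import threading
import time

# Local listener standing in for a remote measurement target
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(5)
host, port = server.getsockname()
threading.Thread(
    target=lambda: [server.accept()[0].close() for _ in range(3)],
    daemon=True,
).start()

def connect_time(host, port):
    """One measurement: TCP connect latency in seconds (PING stand-in)."""
    s = socket.socket()
    start = time.perf_counter()
    s.connect((host, port))
    elapsed = time.perf_counter() - start
    s.close()
    return elapsed

samples = []
interval = 0.05            # pace the probes: one every 50 ms
for _ in range(3):
    samples.append(connect_time(host, port))
    time.sleep(interval)   # the rate limit keeps bandwidth negligible

print(f"{len(samples)} probes, max {max(samples)*1000:.2f} ms")
```

Spacing probes out over time is what lets the real agent cap its footprint at roughly 1 KB/s at peak while still accumulating useful topology data over days and weeks.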
View VRML 3D maps of the Internet from the project's Community page.
Join a discussion forum about this project.
14,463 networks and 39,226 links traced
The following icons may appear in the Supported Platforms section of the table: