Active Distributed Computing Projects - Internet
Project Information | Project % Complete | Major Supported Platforms
Help evaluate the performance of large websites and find bottlenecks with the
PEER client and its first project, peerReview. Some users will be paid up to
US$45 per month for their contributions; payment details were
published in Spring 2009.
Note: peerReview does its work only when a network connection is present, so modem users will notice that the client is not active while they are offline. A work unit completes in 15 minutes, so it is possible for modem users to contribute useful work. The client supports some firewalls; it does not support users behind proxy servers.
The latest version of the PEER client is available as of April 22, 2009 for Windows. The client allows you to configure how much work your system contributes. PEER users' clients will update automatically.
NETI@home provides network researchers with "a wealth of
end-to-end data that has been previously unavailable" in order to make the
Internet faster and more reliable.
"NETI@home is an open-source software package that collects network performance statistics from end-systems. ... NETI@home is designed to run on end-user machines and will collect various statistics about Internet performance. These statistics will then be sent to a server at the Georgia Institute of Technology (Georgia Tech), where they will be collected and made publicly available. We believe that this tool will give researchers much needed data on the end-to-end performance of the Internet, as measured by end-users. Our basic approach is to sniff packets sent to and received by the host and infer performance metrics based on these observed packets. NETI@home users will be able to select a privacy level that will determine what types of data will be gathered, and what will not be reported. NETI@home is designed to be an unobtrusive software system that runs quietly in the background with little or no intervention by the user, and using few resources.
"NETI@home is written in C++ and uses the popular Ethereal network analyzer to sniff packets. NETI@home also uses the zlib compression library. ... NETI@home has been developed as a thesis project at Georgia Tech and has been designed in accordance with the CAIDA specifications on network statistics.
"NETI@home includes the NETIMap application, written in Java, to encourage the use of NETI@home. The NETIMap application, when run, will display a geographical map of the world. As Internet hosts are contacted (a website, for example), a dot is placed on that host's calculated coordinates. The coordinates are calculated using CAIDA's NetGeo database."
Warning for version 1.0: a user has reported that he installed NETI@home on a Win2000 SP4 machine, and after uninstalling it, he noticed there was a service running, trying to execute Neti@home every 20 minutes. He disabled the service, but was unable to remove the registry entries because there was always an "Error during delete." NETI@home personnel responded with the following information:
As a temporary solution, I would recommend using Control Panel->Administrative Tools->Services to stop the service and also to have the service disabled at startup. The strange thing is that the service code is located in "Program Files\Ethereal\neti" and if the program has been uninstalled, this folder should not exist or be empty. NETI@home runs as a service to avoid users having to manually start it and so that it can capture data when users are not logged in (just sitting at login prompt). To manually run NETI@home you can run the go.bat file located in "Program Files\Ethereal".
Version 2.0 of the client is available for Windows, Linux, Mac OS X, and Solaris as of June 1, 2005.
Help DIMES (Distributed Internet MEasurements & Simulations) study the structure and
topology of the Internet. DIMES is part of
EVERGROW. "The DIMES agent performs
Internet measurements such as TRACEROUTE and PING at a low rate, consuming at
peak 1KB/S. The agent does not send any information about its host's
activity/personal data, and sends only the results of its own
measurements. Aside from giving a good feeling, running the DIMES agent will
also provide you with maps of how the Internet looks from your home
(currently) and will (in the future) provide you with a personalized 'Internet
weather report' and other user-focused features."
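The agent's low-rate behavior amounts to rate-limiting its active measurements. A minimal sketch of that pattern, with the measurement itself stubbed out (a real agent would invoke traceroute or ping) and the clock injectable so the loop can be tested without waiting:

```python
import time

def run_measurements(targets, measure, interval=1.0,
                     clock=time.monotonic, sleep=time.sleep):
    """Run one measurement per target, at most one every `interval` seconds.

    `measure` is any callable taking a target and returning a result.
    The rate limit keeps the agent's network footprint low, in the
    spirit of DIMES's ~1 KB/s peak consumption.
    """
    results = {}
    next_slot = clock()
    for target in targets:
        delay = next_slot - clock()
        if delay > 0:
            sleep(delay)  # wait until the next measurement slot
        results[target] = measure(target)
        next_slot = clock() + interval
    return results

if __name__ == "__main__":
    # Stubbed "measurement": a real agent would run ping/traceroute here.
    out = run_measurements(["192.0.2.1", "198.51.100.7"],
                           lambda host: f"measured {host}",
                           interval=0.1)
    print(out)
```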
DIMES passed 100 million measurements on June 16, 2005.
The client uses very little CPU and data bandwidth. It supports users behind firewalls. It requires Java 1.4 or later installed on your system. Version 0.4.3 of the client is available for testing as of January 3, 2006. Version 0.4.2 of the client is available for Windows, and version 0.4.2b for Linux and Mac OS X. Source code for the client is available for download.
View VRML 3D maps of the Internet from the project's Community page.
Join a discussion forum about this project.
29,404 networks and 204,204 links traced;
7,522,460,896 total measurements
Majestic-12 is a distributed World Wide Web search engine. The project's client
software application "crawls" websites to see which sites have changed their
content, and updates a master search index.
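Detecting whether a site has changed its content between crawls is commonly done by comparing content hashes across visits. A minimal sketch of that idea (not Majestic-12's actual algorithm), using SHA-256 digests:

```python
import hashlib

def detect_changes(pages, index):
    """Return URLs whose content changed since the last crawl.

    `pages` maps URL -> fetched content (bytes); `index` maps
    URL -> last known content hash and is updated in place, the way
    a crawler would update its master index.
    """
    changed = []
    for url, content in pages.items():
        digest = hashlib.sha256(content).hexdigest()
        if index.get(url) != digest:
            changed.append(url)
            index[url] = digest
    return changed

if __name__ == "__main__":
    index = {}
    # First visit: everything is new, so everything counts as changed.
    print(detect_changes({"http://example.com/": b"<html>v1</html>"}, index))
    # Second visit with identical content: nothing to re-index.
    print(detect_changes({"http://example.com/": b"<html>v1</html>"}, index))
```

Real crawlers also use HTTP validators such as `ETag` and `Last-Modified` headers to skip fetching unchanged pages entirely; hashing handles servers that do not supply them.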
The project crawled 1,000,000,000 URLs by September 5, 2005. As of March 17, 2006, the project has 1 billion web pages indexed and searchable. The project discovered its 1 trillionth unique URL on October 12, 2009. The only other organization to publicly announce it has discovered 1 trillion URLs is Google.
Version 1.6.11 of the MJ12node client is available for Windows and Linux as of December 6, 2009. Windows users must have Microsoft .NET version 1.1 or 2.0 installed. Linux users must have Mono installed. An MJ12agent application is also available: it allows you to control an MJ12node client on a separate computer.
A Majestic12 search engine plug-in is available for the Firefox browser as of August 29, 2005.
Join a discussion forum about this project.
184,629,065,290 URLs indexed
Boitho is a
general World Wide Web search engine.
The project's client software application "crawls" website URLs to index
their content, and creates thumbnail images of them for its master search
index. The site is also available in other languages.
To participate in the project, download and install BoithoCrawler.exe, then run the BGui executable or use the Start menu: start -> All programs -> BoithoCrawler -> Start Boitho Crawler. When the application starts, move the "Run Nr of crawlers" slider to the right to start 1 or more web crawlers. The BoithoCrawler client is available for Windows.
397,263,160 URLs indexed
PeerFactor is a distributed computing and peer-to-peer
file-sharing project. The site is also available in other languages.
It uses its
participants' spare bandwidth and CPU cycles, and rewards the participants
with game and music downloads. Participants receive credit for the results
of the file-sharing and computing that their PeerFactor client completes,
rather than the amount of time they make their computer available, so anyone,
including modem-users, can participate. Participants will be able to redeem
their credits for free downloads from commercial game and music download
services. For now participation is only open to French citizens. An English
version of the client will be available in late 2006, and participation will
open to citizens of all countries when that version is available.
The project currently runs the following applications: peer-to-peer (P2P) sharing of legal promotional content (see some examples); improvement of search engine performance. In the future it will include other applications: benchmarking and monitoring of websites; distributed computing; rendering graphics; weather-related computing.
To participate in the project, download and run PeerFactor.exe. The PeerFactor client is currently only available for Windows; a Linux version should be available soon. The client is not configurable, to make it easier to use. It runs at idle priority. The client can be used behind a firewall, but the firewall should authorize PeerFactor to use the maximum possible number of ports. The client currently only uses about 5% of available CPU to crawl and index websites for commercial search engines.
Help the Dependency Spider project complete its two major goals:
1st: Build up a database containing the dependencies between individual web sites and groups of web sites.
All of the project's results will be made publicly available.
As of June 22, 2008, no work units are available for phase-1. The project owners are developing phase-2 and will issue new work units when that phase is ready.
The project uses the BOINC computing platform to run various applications. See the BOINC platform information for the latest version of the BOINC client. Version 5.07 of the project's Spider software application is available for Windows as of December 17, 2006.
Join a discussion forum about this project.
Waiting for phase-2 to begin.
Help Pingdom's GIGRIB project
measure uptime for websites. Participants in the project monitor the
uptime of websites specified by other project participants, and can specify
their own websites to monitor. The project's "website shows the uptime
information for the thousands of websites that have been added by
[its participants]. The information is public and searchable." The project
is sponsored by Pingdom, which also runs
a commercial, more fully featured version of the website monitoring service
using Pingdom's own servers. "GIGRIB stands for Green Is Good, Red Is Bad."
See more information about the project.
To participate in the project, sign up for a free account, then download and run the project's client software application. Version 1.0.129 of the GIGRIB client is available for Windows as of December 11, 2006.
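Uptime monitoring of this kind reduces to periodic probes plus an aggregate success ratio. A minimal sketch of the aggregation step (not the GIGRIB client's code), where each probe result is a boolean recorded by a monitoring participant:

```python
def uptime_percent(checks):
    """Return uptime as the percentage of successful probes.

    `checks` is a sequence of booleans, one per monitoring probe
    (True means the monitored site responded in time).
    """
    if not checks:
        return 0.0
    return 100.0 * sum(checks) / len(checks)

if __name__ == "__main__":
    # 3 of 4 probes succeeded: 75% uptime over this window.
    print(uptime_percent([True, True, False, True]))
```

A real monitor would feed this from timed HTTP requests issued by many participants, which is what lets the project distinguish a site that is truly down from one unreachable from a single vantage point.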
Grub, owned by
Search Wikia Labs,
is attempting to create a better search index than
Google. The project is designed around
"Four Organizing Principles (TCQP)," on which, it says, the future of Internet search must be based.
The project, which began independently in 2001, was bought and ended by LookSmart on October 11, 2005. In its first incarnation it was "an open-source distributed Internet crawler." The client "crawled" websites to see which sites had changed their content, and updated a master search index in real-time. grub.org hoped to create and maintain the most comprehensive and up-to-date search index of the Internet ever, and provided update feeds of crawled sites to the public for free and to commercial search engines. LookSmart gave the following reason for ending the project:
Effective October 11, Grub.org, a project of LookSmart, will be shutting down its servers. LookSmart is focused on dedicating our internal resources to meet objectives that are core to our business strategy. As a result, certain elements that are no longer central to the business are being phased out.

Wikia bought the project from LookSmart in 2007 and is currently restarting it.
See the project's wiki.
Waiting to begin.