BOINC Credit System
Within the BOINC platform for volunteer computing, the BOINC Credit System[1] helps volunteers keep track of how much CPU time they have donated to various projects. It is also intended to ensure that users return accurate results, which matters for both scientific and statistical reasons.
Purposes for a credit system
Online distributed computing relies heavily, if not entirely, on volunteer computers. For this reason, projects such as SETI@home and other BOINC projects depend on a delicate balance between long-term users and the continual cycle of new users joining and old users retiring.[2]
Reasons for participation
- Efficiency: using a computer's resources that would otherwise be idle.
- To contribute to scientific research in general.
- To advance a specific field.
- To stress test computers, as with overclocking.
- For competition's sake (by joining a team).
- Some individuals and teams run numerous computers, some dedicated specifically to BOINC, in hopes of climbing to the top of the world charts. Some of these teams are institutional, e.g. universities and research centers.
- For personal benefit and recognition.
- Projects such as NASA's PlanetQuest plan to allow individuals to name planets discovered using their computers.
- Projects such as BURP and Leiden Classical allow users to submit their own jobs to the system: BURP lets users submit models to be rendered, and Leiden Classical lets users submit physics calculations.
- Cryptocurrency projects such as Gridcoin have their proof of work reward tied to BOINC credits.
Note that these reasons are not mutually exclusive.
Cobblestones
The basis for the BOINC credit system is the cobblestone, named after Jeff Cobb of SETI@home. By definition, 200 cobblestones are awarded for one day of work on a computer that meets either of two benchmarks:
- 1,000 double-precision MFLOPS based on the Whetstone benchmark
- 1,000 VAX MIPS based on the Dhrystone benchmark
The number of credits granted for a given work unit is meant to reflect the actual computational effort it requires, so that work of any length can be processed and credited consistently. To do this, BOINC uses benchmarks to measure the speed of a system and combines that figure with the amount of time required to process a work unit; the client can then estimate the amount of credit a user should receive. Because systems vary widely, including in the amount of RAM, processor speed, and the specific architectures of different motherboards and CPUs, there can be wide discrepancies in the number of credits that different computers (and projects) judge a user to have earned.
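The following is a minimal sketch of how a claimed credit could be estimated from the cobblestone definition above. Averaging the two benchmark figures, and the function and variable names, are illustrative assumptions rather than the exact formula used by the BOINC client.

```python
# Illustrative sketch of a benchmark-based credit claim, following the
# cobblestone definition above (200 credits for one day of work at
# 1,000 MFLOPS Whetstone / 1,000 VAX MIPS Dhrystone).
# Averaging the two benchmark figures is an assumption for illustration,
# not necessarily the exact formula used by the BOINC client.

SECONDS_PER_DAY = 86_400
CREDIT_PER_BENCHMARK_DAY = 200  # cobblestones per day at the reference speed

def claimed_credit(cpu_seconds: float,
                   whetstone_mflops: float,
                   dhrystone_mips: float) -> float:
    """Estimate the credit a host would claim for a work unit."""
    # Express the host's speed as a multiple of the 1,000 MFLOPS /
    # 1,000 MIPS reference machine, averaging the two benchmarks.
    speed_factor = (whetstone_mflops / 1_000 + dhrystone_mips / 1_000) / 2
    days = cpu_seconds / SECONDS_PER_DAY
    return CREDIT_PER_BENCHMARK_DAY * days * speed_factor

# Example: a host twice as fast as the reference machine running a
# work unit for 6 hours of CPU time would claim about 100 credits.
print(claimed_credit(6 * 3_600, 2_000, 2_000))  # -> 100.0
```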
Most projects[which?] require a consensus to be reached by having multiple hosts return the same work unit. If the results agree, the credit is calculated and all hosts receive the same amount, regardless of what each claimed. Each project can use its own policy depending on what best fits its specific needs. In general, the highest and lowest claimed credits are dropped and an average[citation needed] of the remaining claims is taken. Certain other projects[which?], however, award a flat amount per work unit returned and validated.
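A minimal sketch of the consensus policy described above, assuming each host in a quorum submits a claimed credit; the function name and the handling of very small quorums are illustrative assumptions.

```python
def granted_credit(claims: list[float]) -> float:
    """Grant the same credit to every host in a quorum of matching results.

    Following the policy described above: drop the highest and lowest
    claims and average the rest.  For quorums of one or two claims there
    is nothing left after dropping, so all claims are averaged; this
    small-quorum handling is an assumption for illustration.
    """
    if not claims:
        raise ValueError("no claims to grant credit for")
    ordered = sorted(claims)
    trimmed = ordered[1:-1] if len(ordered) > 2 else ordered
    return sum(trimmed) / len(trimmed)

# Example: three hosts claim different amounts for the same work unit;
# the outliers are dropped and every host is granted 95.0 credits.
print(granted_credit([80.0, 95.0, 140.0]))  # -> 95.0
```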
Total credit
Credits are tracked internally for computers, users, and teams. When a computer processes and returns a work unit, it receives no credit for that action alone; the work unit must first be validated by the project-specific method. Once validated, the computer is granted credit, which can be less than, equal to, or greater than what was claimed. This amount is immediately added to the computer, user, and team totals. If a work unit is returned past the deadline (in most cases) or is found to be inaccurate, it is marked as invalid and earns no credit. Users and teams commonly determine world rank by comparing total accumulated credit. This heavily favors users and teams that have participated the longest, making it difficult for new users to gain ground rapidly in the rankings, even if they run many computers. That said, given the exponential increase in the computing power of the average PC, it is relatively easy to surpass inactive BOINC users who earned all of their points on obsolete machines, even if they were at one time ranked highly. Thus, the highest-ranked BOINC users will generally be those who are actively crunching.
Recent average credit
To gauge the useful amount of work a computer is currently providing, a calculation called recent average credit (RAC) is used. It is designed to estimate the number of credits a computer, user, or team will accumulate on an average day.
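One way such a figure can be maintained is as an exponentially decaying average of granted credit. The sketch below illustrates this idea; the one-week half-life and the particular update rule are assumptions for illustration, since the exact formula used by BOINC is not specified here.

```python
import math

# Illustrative sketch: maintain a "recent average credit" figure as an
# exponentially decaying average of granted credit.  The one-week
# half-life and this update rule are assumptions for illustration, not
# necessarily BOINC's exact formula.

HALF_LIFE_SECONDS = 7 * 86_400  # assumed half-life of one week

def update_rac(old_rac: float, new_credit: float,
               seconds_since_last_update: float) -> float:
    """Return the new RAC after `new_credit` is granted.

    The old average decays toward zero with the assumed half-life, and
    the newly granted credit (expressed as credit per day over the
    elapsed interval) is blended in with the complementary weight.
    """
    decay = math.exp(-math.log(2) * seconds_since_last_update / HALF_LIFE_SECONDS)
    credit_per_day = new_credit / (seconds_since_last_update / 86_400)
    return old_rac * decay + credit_per_day * (1 - decay)

# Example: a host with RAC 100 that keeps earning 100 credits per day
# stays near 100; one that stops earning sees its RAC halve each week.
print(round(update_rac(100.0, 100.0, 86_400), 1))            # ~100.0
print(round(update_rac(100.0, 0.0, HALF_LIFE_SECONDS), 1))   # 50.0
```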
Controversies
The credit allocation has been challenged for several projects, such as EON[3] and Asteroids@home.[4] These concerns have contributed to the shutdown of many such projects over time and have also led to several alternative allocation strategies.[5]
Third-party statistics sites
BOINC projects export statistical information in the form of XML files and make it available for anyone to download. Many third-party statistical websites have been developed to track the progress of BOINC projects. These sites track computers, users, teams, and countries within individual projects and across many projects. Many of them also provide summary graphics that automatically update and can be embedded on web pages to show the statistics of a specified user or team:
- BOINCstats.com by Willy de Zutter
- BOINC Combined Statistics by James
- Free-DC Stats by Bok
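As a rough illustration of how such a site might consume the XML exports described above, the following minimal sketch parses a hypothetical user-statistics file; the file name and the element names used (user, name, total_credit) are assumptions for illustration, not a documented BOINC schema.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of consuming a project's XML statistics export, in the
# spirit of the third-party sites listed above.  The file name and the
# element names used here (user, name, total_credit) are assumptions
# for illustration, not a documented BOINC schema.

def top_users(path: str, limit: int = 10) -> list[tuple[str, float]]:
    """Return (user name, total credit) pairs sorted by total credit."""
    root = ET.parse(path).getroot()
    users = []
    for user in root.iter("user"):
        name = user.findtext("name", default="anonymous")
        total = float(user.findtext("total_credit", default="0"))
        users.append((name, total))
    users.sort(key=lambda pair: pair[1], reverse=True)
    return users[:limit]

# Example (assuming a downloaded export saved as user_stats.xml):
# for name, credit in top_users("user_stats.xml"):
#     print(f"{name}: {credit:,.0f} credits")
```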
References
[edit]- ^ Awan, Malik Shahzad K.; Jarvis, Stephen A. (18 October 2012). "MalikCredit - A New Credit Unit for P2P Computing". 2012 IEEE 14th International Conference on High Performance Computing and Communication & 2012 IEEE 9th International Conference on Embedded Software and Systems. pp. 1060–1065. doi:10.1109/HPCC.2012.155. ISBN 978-1-4673-2164-8. S2CID 14914817. Retrieved 27 July 2022.
- ^ Korpela, Eric J. (2012-05-30). "SETI@home, BOINC, and Volunteer Distributed Computing". Annual Review of Earth and Planetary Sciences. 40 (1): 69–87. Bibcode:2012AREPS..40...69K. doi:10.1146/annurev-earth-040809-152348. ISSN 0084-6597. Archived from the original on 2021-03-08. Retrieved 2021-02-09.
- ^ Chill, Samuel T; Welborn, Matthew; Terrell, Rye; Zhang, Liang; Berthet, Jean-Claude; Pedersen, Andreas; Jónsson, Hannes; Henkelman, Graeme (2014-07-01). "EON: software for long time simulations of atomic scale systems". Modelling and Simulation in Materials Science and Engineering. 22 (5): 055002. Bibcode:2014MSMSE..22e5002C. doi:10.1088/0965-0393/22/5/055002. ISSN 0965-0393. S2CID 13990151.
- ^ Ďurech, J.; Hanuš, J.; Vančo, R. (2015-11-01). "Asteroids@home—A BOINC distributed computing project for asteroid shape reconstruction". Astronomy and Computing. 13: 80–84. arXiv:1511.08640. Bibcode:2015A&C....13...80D. doi:10.1016/j.ascom.2015.09.004. ISSN 2213-1337. S2CID 15706262.
- ^ Estrada, Trilce; Flores, David A.; Taufer, Michela; Teller, Patricia J.; Kerstens, Andre; Anderson, David P. (December 2006). "The Effectiveness of Threshold-Based Scheduling Policies in BOINC Projects". 2006 Second IEEE International Conference on e-Science and Grid Computing (E-Science'06). Amsterdam, the Netherlands: IEEE. p. 88. doi:10.1109/E-SCIENCE.2006.261172. ISBN 978-0-7695-2734-5. Archived from the original on 2021-04-28. Retrieved 2021-05-12.