Engineer 8-5-4-16 Dense Graph Paper: Fill & Download for Free


How to Edit and Fill Out Engineer 8-5-4-16 Dense Graph Paper Online

Read the following instructions to use CocoDoc to start editing and signing your Engineer 8-5-4-16 Dense Graph Paper:

  • First, find the “Get Form” button and click it.
  • Wait until Engineer 8-5-4-16 Dense Graph Paper is ready to use.
  • Customize your document using the toolbar at the top.
  • Download your finished form and share it as needed.


How to Edit Your PDF Engineer 8-5-4-16 Dense Graph Paper Online

Editing your form online is effortless. There is no need to install any software on your computer or phone to use this feature. CocoDoc offers an easy tool for editing your document directly in any web browser, and the interface is well organized.

Follow the step-by-step guide below to edit your PDF files online:

  • Open the official CocoDoc website on the device where your file is stored.
  • Find the ‘Edit PDF Online’ option and click on it.
  • The free tool page will open. Drag and drop your PDF, or upload the file through the ‘Choose File’ option.
  • Once the document is uploaded, you can edit it using the toolbar as needed.
  • When you have finished editing, press the ‘Download’ button to save the file.

How to Edit Engineer 8-5-4-16 Dense Graph Paper on Windows

Windows is the most widely used operating system. However, it does not include a default application that can edit PDFs directly. In this case, you can install CocoDoc's desktop software for Windows, which helps you work on documents efficiently.

All you have to do is follow the steps below:

  • Install the CocoDoc software from the Windows Store.
  • Open the software and then import your PDF document.
  • You can also import the PDF file from OneDrive.
  • After that, edit the document as needed using the various tools at the top.
  • Once done, save the finished form to your device. You can also check more details about how to edit a PDF.

How to Edit Engineer 8-5-4-16 Dense Graph Paper on Mac

macOS comes with a default application, Preview, for opening PDF files. Although Mac users can view PDF files and even mark up text in them, Preview does not support full editing. With CocoDoc, you can edit your document on a Mac quickly.

Follow the effortless instructions below to start editing:

  • First, install the CocoDoc desktop app on your Mac.
  • Then import your PDF file into the app.
  • You can upload the PDF from any cloud storage, such as Dropbox, Google Drive, or OneDrive.
  • Edit, fill and sign your template using the editing tools.
  • Lastly, download the PDF to save it on your device.

How to Edit PDF Engineer 8-5-4-16 Dense Graph Paper via G Suite

G Suite is Google's suite of intelligent apps, designed to make your work faster and increase collaboration within teams. Integrating CocoDoc's PDF document editor with G Suite helps you get work done smoothly.

Here are the steps to do it:

  • Open the Google Workspace Marketplace on your computer.
  • Search for CocoDoc PDF Editor and install the add-on.
  • Upload the PDF you want to edit, then open it with CocoDoc PDF Editor by choosing "Open with" in Drive.
  • Edit and sign your template using the toolbar.
  • Save the finished PDF file on your computer.

PDF Editor FAQ

What are the valid facts in the NGT v/s Art of Living case?

There have been many arguments both for and against the World Culture Festival 2016 being conducted on the Yamuna plains. The National Green Tribunal alleges that the event has destroyed the river and the environment.Let us go through a detailed analysis of the various important claims put out by the National Green Tribunal’s (to be referred as NGT hereafter) Expert Committee alleging damage to the Yamuna Floodplains post the World Culture Festival, 2016 conducted by the Art of Living Foundation.I would like to present before the readers a statistically unbiased and scientifically backed representation of this case which is currently sub-judice.Land Description:Before we go further, it is essential that we first geographically define the area over which the World Culture Festival was conducted (From 11th March –13th March 2016)The land parcel is a finite piece of land over the Yamuna floodplain bound by the DND Flyover to its South; Barapulla Drain to its North;River Yamuna to its East and Ring Road to its West.Area ~ 25 hectaresCan be located on WGS (World Geodetic System) 84 coordinates 24 deg 34’55’’N and 77 deg 16’43’’EHere is the detailed image categorically bifurcating the various land sites w.r.t it’s usage for the event(Source: Google Earth, 15th of March ,2016)2. It is very essential to draw the following conclusions from the above satellite images:3-Permanently ramps existed since 2008 (Having bituminous pavement,with potholes and degenerating bituminous overlay).7-Area marked by the purple region previously had mounts of construction waste (malba) solid waste,which has been cleared for the WCF 2016 event by the Art of Living foundation at its own expense.8-Unpaved earthen road running parallel to the Barapullah drain,which is in existence at least since the year 2000,used for vehicles and earth-movers engaged in drain cleaning and slit removal in this section of Barapullah drain.The temporary installations and the make-shift stage are also clearly visible from the satellite imagery.Fig 1:Permanent Ramp 1 having bituminous pavement existed at least since Jan 2008Fig 2:Permanent Ramp 2 having bituminous pavement existed at least since Jan 2008Now, let us go and categorically visit each claim made by the NGTClaim 1: No permission sought from the NGT by the Art of Living Foundation before the eventFacts:The entire site belongs to the Delhi Development Authority (DDA)Firstly,the NGT is a court,a tribunal and an autonomous institute, not a government agency to give permissions.Here is the official permission letter sought by the Art of Living (Vyakti Vikas Kendra India-Trust of the foundation) to use the land site for the event and it no where states that prior permission from the NGT needs to be sought.As you can see, the permission letter is approved by the Honorable L.G. 
of Delhi and duly signed by the Office of the Chief Engineer (E.P.)Mr.D.P.Singh of the DDA on 15th of December 2015.Apart from this, the Art of Living organization had taken permission from 20 various organizations and competent authorities (Government Bodies)The entire list of the government bodies is elucidated below:Central Public Works DepartmentDelhi Development AuthorityDelhi Fire ServiceDelhi Jal BoardDelhi PoliceDelhi Pollution Control CommitteeDelhi Traffic PoliceDepartment of Irrigation and Flood ControlDistrict Disaster Management AuthorityEast Delhi Municipal CorporationIndian ArmyIrrigation DepartmentMinistry of Environment and ForestsMinistry of External AffairsMinistry of Home AffairsNew Okhla Industrial Development AuthorityPublic Works Department-DelhiPublic Works Department-UPSouth Delhi Municipal CorporationUttar Pradesh Government2. Claim 2: Alleging the presence of wetlands on the eventFacts:To understand this point, we need to first comprehend the difference between a wetland and a floodplain.According to the Ramsar Convention on Wetlands ,signed in 1971 (Iran) wetlands are defined as: "areas of marsh, fen, peatland or water, whether natural or artificial, permanent or temporary, with water that is static or flowing, fresh, brackish or salt, including areas of marine water the depth of which at low tide does not exceed six metres"(The Convention on Wetlands, called the Ramsar Convention, is an intergovernmental treaty that provides the framework for national action and international cooperation for the conservation and wise use of wetlands and their resources. Number of Contracting Parties: 169)There are currently 26 Ramar sites in India which are enlisted below:Ashtamudi Wetland,KeralaBhoj Wetland,Madhya ParadeshChandertal Wetland,Himachal PradeshChilika Lake,OrissaDeepor Beel,AssamEast Calcutta Wetlands,West BengalHarike Lake,PunjabHokera Wetland,Jammu and KashmirKanjili,PunjabKeoladeo National Park,RajasthanKolleru Lake,Andhra PradeshLoktak Lake,ManipurNalsarovar,GujaratPoint Calimere Wildlife and Bird Santuary,Tamil NaduPong Dam Lake,Himachal PradeshRenuka Wetland,Himachal PradeshRopar,PunjabRudrasagar Lake,West Tripura DistrictSambhar Lake,RajasthanSasthamkotta Lake,KeralaSurinsar-Mansar Lake,Jammu and KashmirTsomoriri,Jammu and KashmirUpper Ganga River,Uttar PradeshVembanad-Kol Wetland,KeralaWular Lake,Jammu and KashmirBhitarkanika Mangroves,OrissaSo, according to the Ramsar sites (India being a signatory of the Ramsar convention) , the World Culture Festival venue does not come under the wetland category.Wetlands come under Ecologically Sensitive Zones are are protected areas by the government.Now,let us analyse the land area with respect to the Survey of India (The Survey of India is India's central engineering agency in charge of mapping and surveying) mapsHere is the Annexure -1B Portion of 1:25000 Scale Detailed Map of Delhi, published by the Survey of India in the year 1985,Under the direction of Major General Girish Chandra Aggarwal, Surveyor General of India; Titled-’Delhi Guide Map,Third Edition 1985′The map clearly depicts the WCF 2016 event site as an extremely flat “Point Bar” (floodplain deposit) without existence of any wetland or enclosed waterbody. 
The flatness of this land parcel is to the extent that contour indicating difference in height of the order of 100cm is also non-existent throughout the area.Another important point for record in this map is the existence of natural path of ‘Kushak River – Barapullah Drain’ prior to straightening of its channel traversing straight into River Yamuna and filling of its original channel. The map also depicts the situation prior to construction of Guide Bank and DND Flyway.Looking at the National Wetland Atlas (Published in March 2011 by the Space Application Centre,ISRO,Ahmedabad),it fails to indentify a single wetland on the event site.According to the National Geographic Society, floodplains are defined as “A flood plain (or floodplain) is a generally flat area of land next to a river or stream. It stretches from the banks of the river to the outer edges of the valley”A Floodplain does not require jurisdictions of the environmental authorities that a wetland does.Floodplains have a rich history of interacting with the society for civilizations to flourish.The first great civilizations all grew up in river valleys. The oldest, 3300 to 2500 BCE, was along the Tigris and Euphrates rivers in the Middle East; the name given to that civilization, Mesopotamia, means "land between the rivers". The Nile valley in Egypt had been home to agricultural settlements as early as 5500 BCE, but the growth of Egypt as a civilization began around 3100 BCE. A third civilization grew up along the Indus River around 2600 BCE, in parts of what are now India and Pakistan. The fourth great river civilization emerged around 1700 BCE along the Yellow River in China, also known as the Huang-He River Civilization.Many towns have been built across floodplains because of easy access to fresh water,the fertility of floodplain land for farming,cheap transportation, via rivers and railroads, which often followed rivers and ease of development of flat land.Large cultural and religious gatherings taking place on various floodplains across India like the Kumbh Mela and the Maramon Convention.The World Culture Festival venue was thus a floodplain and not a wetland as claimed by the Expert Committee of the NGT.3. Claim 3: Destroying the natural flow of the river YamunaFacts:A comparison of river morphology has been conducted on satellite images for the period between 22nd of December 2000 till 10th of May 2016.Fluvial Geo-morphology of the river (i.e. 
land form related to the river) and its floodplain (over which the event was organized) indicates continuity of a pattern in channel dimension, sedimentation,bank deposition,bank erosion and flow of the river.On a careful examinations of images between 26th August 2015(month of monsoon in Delhi) to 10th May 2016(Pre-Monsoon Dry Summer Season) no abnormality in the pattern of flow;channel dimension;riverbed;or morphology of both the banks has been noticed.Further,examination by ground verification in a series of field studies found no scientific acceptance of the above mentioned claim by the NGTHere are the satellite images taken that show the continuity of pattern of the river flow:Pic 1: 26th August 2015 (Post-monsoon)Pic 2: 29th October 2015 (Post-monsoon)Pic 3: 23rd November 2015 (Post-monsoon)Pic 4: 15th of December 2015 (Post-monsoon)Pic 5: 27th of February 2016 (Preparation for the event in progress;stage scaffoldings under construction)Pic 6: 15th of March 2016 (Post WCF, Pre-Monsoon)Pic 7: 25th of May 2016 (Depicting the entire area utilized by the World Culture Festival 2016,now evacuated,cleared and all the temporary installations removed)And as I write this answer now (13 th of May 2017, 15:39 hr), I have taken the snapshots from Google Earth of the venue and it is as it was before the event.The above images tell us that there is no change whatsoever in the natural river course of the Yamuna. This again proves the Expert Committee’s claims as false and unscientific.Selection Bias by the Expert Committee members?In the report,the expert committee have replied upon a singular satellite image for the event as opposed to a larger sample size covering pre-monsoon and post monsoon images for 15–20 years despite its availability to the public on Google Earth.The Expert Committee compared the satellite images of the venue dated 5th of Sept 2015 (Peak monsoon season) with that of a mid summer picture of March 2016, post the event (Summer pre-monsoon)It doesn’t take an Einstein to realize that such a comparison cannot be done in the first place.The Expert Committee is basically trying to attribute the negative effect of the lack of rain to the World Culture Festival !4. Claim 4: Destroying the reeds,grasses,natural vegetation on the river bed and the venueFacts:High pollution in the River Yamuna has led to a situation where dissolved oxygen tends to zero (often less than 1), high load of suspended particulate matter, high turbidity almost blocks the sunlight penetration in the water within few centimeters of vertical depth etc.All of the above factors collectively create a situation where no macro flora could grow or anchored within the riverbed of the Yamuna.Reeds and grasses could only grow beyond the riverbed in the riparian zone of the floodplain.The images of the last 15 years,indicate that the floodplain around the venue had some strips and patches of reeds. 
Those patches have been compared with the images after the event and on a comparison of images, no change in area covered by reeds has been observed.Riparian reeds along the Barapullah drain and small patches behind and in front of the guide bank (near the bridge of DND flyway) are unaltered before , during and the period post the event.Also the number of trees before and after the event were counted using high risk satellite images and they were found to be the same.Pic 1: Regrowth of grasses on over the location where once the stage was raised (17th of April 2016)Pic 2: State of grasses on the event venue (17th of April 2016)Pic 3: Undisturbed riparian reeds along the abandoned channel in front of the Guide Bank (17th of April 2016)Pic 4: Undisturbed riparian reeds along the Barapullah Drain (17th of April 2016)Also, another important to note that the Expert Committee of the NGT accused the Art of Living Foundation for bringing in and dumping the malba (construction debris) and flattening the malba thereby destroying the flood plain.Firstly, as shown by the satellite images of the land parcel furnished earlier, the purple area indicates the unsolicited construction debris that existed since the year 2000.Trucks were seen emptying tonnes of malba on a daily basis when the Art of Living began preparation for the event (circa December 2015)Once the land was allotted to the foundation, the Art of Living sent out a letter the the DDA citing these concerns about the existing construction dump.To which, the DDA didn’t remove the debris whereas told the Art of Living Foundation to remove the malba at its own cost !Here, is the invoice of the contractor ,who was given the duty to remove the debris from the venue site ,under the instructions the Art of Living Foundation (Cost borne by the Art of Living Foundation !)This is how the site appeared before the event:So, why did the NGT falsely accuse the foundation wherein no cementing/foundation work of any sort was undertaken for the event?5. Claim 5: Disturbed the aquatic life of the riverFacts:The river Yamuna (Delhi stretch) is nearly devoid of fish species due to extremely low level ( ~0) dissolved oxygen in the river water.Let us analyze the water quality of the river Yamuna first.Referring to the “Water Quality Status of Yamuna River” report by the Central Pollution and Control Board (erstwhile Ministry of Environment and Forests,Government of India),here is the longitudinal profile of the dissolved oxygen.(Report foreword by V. Rajagopalan, Chairman-CPCB)(Notice the graph points near Nizamuddin Bridge ,Agra Canal)The report goes out further to state that“ In Yamuna River low BOD and low DO was observed more oftenly may be due to consumption of oxygen by settled sludge in the riverbed.”(Ref 3.13, page No.42)Other key notations from the report are listed below:“The sources contributing pollution are both point & non-point type. Urban agglomeration at NCT – Delhi is the major contributor of pollution in the Yamuna River followed by Agra and Mathura. About 85% of the total pollution in the river is contributed by domestic sources. The condition of river deteriorate further due to abstraction of significant amount of river water, leaving almost no fresh water in the river, which is essential to maintain the assimilation capacity of the river.”“In the critically, polluted stretch of Yamuna river from Delhi to Chambal confluence, there was significant fluctuations in dissolved oxygen level from Nil to well above saturation level. 
This reflects presence of organic pollution load and persistence of eutrophic conditions in the river.”“Bacteriological contamination is significantly high in the entire Yamuna River stretch. Total Coliforms are generally well above the prescribed water quality standard even sometimes at Yamunotri also. The microbiological analysis confirms that the bacteriological contamination was predominantly contributed by human beings.”Here are the longitudinal profiles of the Total and Faecal Coliforms in Yamuna River :Not to forget the drains opening up in the Delhi stretch.“Najafgarh drain of NCT – Delhi is the biggest polluter of River Yamuna, which contributes about 26% (year 2001) to 33% 22 (year 2000) of total BOD load and 48% (year 2003) to 52% (year 2001) of total discharge that joins Yamuna river and canal at Delhi by various drains. There are 70 sub drains that join main Najafgarh Drain. The study indicated that the total BOD load received by Najafgarh Drain through sub-drains was 136 TPD, whereas the BOD load at the terminal end of the Najafgarh Drain was 83 TPD only. This reduction may be contributed by biodegradation, deposition of setllable material at the bottom and diversion of drain water for irrigation etc”.“ River Yamuna receives significantly high amount of organic matter, which is generally, originates from domestic sources. For biodegradation, this organic waste requires oxygen, causing significant depletion of dissolved oxygen in river water. The oxygen depletion not only affects biotic community of the river but also affects its self-purification capacity. This problem is critical in the river stretch between Delhi and confluences of river with Chambal. In Delhi stretch, the load of organic matter is so high that it consumes the entire dissolved oxygen available in river water.”Presenting another latest report by the Central Pollution Control Board titled “Water quality status of in Delhi stretch of Yamuna River”Exhibit 1: Water quality of river Yamuna river in terms of Dissolved Oxygen (DO)The above graph clearly shows that the standard DO should be~4–5 whereas near the Nizamuddin bridge and Okhla region it below 1.Exhibit 2: Water quality of river Yamuna in terms of Total ColiformThe report also throws light on the discharge of various drains in the river Yamuna.“There are twenty one major wastewater drains in NCT-Delhi, out of which 18 drains join Yamuna River and rest joins Agra/Gurgaon canal.All the drains join Yamuna River downstream of Wazirabad barrage.These drains are being monitored regularly on monthly basis.The range of total BOD Load of 18 drains join Yamuna river was 105 TPD (August, 2015) to 229 TPD (January, 2016).Total discharge of these drains was varied from 29 m3/s (May, 2016 to 43 m3/s (August, 2014).The collective average of these drains for the year 2015 and 2016 in terms of discharge was about 34.8 m3/s and 34.3 m3/s respectively whereas, BOD load average for these two years was 164 Tons/day (TPD) and 178 Tons/day respectively.Based on the Discharge and BOD load of 18 drains Najafgarh drain was the biggest polluter of Yamuna River followed by Shahdara drain. These two drains alone contributes about 74% of total Bod load and 82% of total discharge of the 18 major drains that join Yamuna river at Delhi.”Exhibit 3: Discharge of major drains joins Yamuna River at DelhiLastly, I wish to produce a report titled “ Restoration and Conservation of River Yamuna” authored by the NGT Expert Committee members itself in the year 2012–13(Authors Prof. 
C.R.Babu, Prof.A.K.Gosain, Prof.Brij Gopal-All being expert members of the NGT)The report categorically states that“the loss of life supporting potential of the river is the major concern to the public, the Government and the courts”“the flowing water, the river bed, the floodplain forest and grassland ecosystems are locally extinct”Here is a snapshot of the same :The report also states that:“The Delhi urban stretch of 22 km in the downstream of Wazirabad barrage upto Okhla barrage (Section III) is critically polluted and dry weather flow is almost the treated and untreated sewage from 22 drains and the fresh water flow from upstream or lateral connection and it is perhaps one of the most polluted river stretches in the country with zero DO and over 30 mg/l BOD levels”Whereas, the same committee members in its final report slamming the Art of Living state the following:How could the World Culture Festival destroy something that according to the same committee members didn’t even exist in the first place.Why is the Art of Living blamed selectively for the pollution of the river Yamuna over the past decades?Isn’t this nothing but sheer hypocrisy?From the above data, following points to be noted:Yamuna river (Delhi stretch)is a dead river with almost zero dissolved oxygen, high amounts of pollutants and no fresh water. How can aquatic life survive under these chemically harsh conditions?The discharge of major drains in the Delhi stretch of the river along with industrial effluents and the pollution levels of Yamuna is alarming.Why does the NGT put the blame on the Art of Living Foundation which has done zero damage to the floodplains and the river?What has NGT done to curb the industrial and human pollution which are harming the river Yamuna?There is a strong judgmental bias in the current NGT report Vs the Art of Living and Others6. Claim 6: Alleging compaction and leveling of the floodplainFacts:Before going to analyse the charges of compaction, it is quintessential that we first define the nature of the land where the event was conducted.According to the report “Environmental flow for monsoon rivers in India-The Yamuna river as a case study”, the Yamuna floodplains has alluvial sandy soil (Reference: Rao, S.V.N., Kumar, S., Shekhar, S., Sinha, S.K. & Manju, S. 2007. Optimal pumping from Skimming Wells from the Yamuna river flood plain in north India. Hydrogeology Journal 15: 1157-1167)According to one of the Expert Committee member-Prof.A.K.Gosain’s earlier published research paper titled- “A new scheme for large-scale natural water storage in the floodplains: the Delhi Yamuna floodplains as a case study”, the author says “the river has been bringing sand from the mountains and depositing it along its basin, forming the floodplains. This accumulated sandy layer exists to an average of depth of 40 m”The report earlier furnished in claim 5 by the expert committee members itself (Can be found here) states that the floodplain near the river Yamuna has “sand and gravel”.Hence, it is a well established fact that the floodplain has sandy soil ! 
So, can sandy soil be compacted ?Now,given the above data, let us go through some scientific studies about sand compaction and verify the allegations by the NGT.For a confirmatory statement on the extent and exact reason of consolidation and/ or compaction in qualitative and quantitative terms, laboratory test of undisturbed soil samples from the land parcel will be required.By comparing the current soil density with the previous records of soil density over the land parcel,the difference could be worked out.But conducting such a test of unconfined sand/sandy soil appears almost impossible due to the established principles of soil mechanics.Referring to the established principles of soil mechanics and geo-technical engineering from the widely accepted and used textbook for soil mechanics by Prof.V.N.S.Murthy tiled “A Text Book of Soil Mechanics & Foundation Engineering” let us go through the pressure-void ratio curves of sandIn the above curve, it is clearly evident that “ more than 90 % of the compression has taken place within a period of less than 2 minutes. The time lag is largely frictional. The compression is about the same whether the sand is dry or saturated”.“The amount of compression even under high load intensity is not quite significant as can be seen from the curves.”It is obvious that the natural consolidation of this land parcel would have taken place in the geological past immediately after the deposition with some movement of animals and humans over it.It appears from the final report that the expert committee didn’t conduct any geo-technical analysis and not a single report was attached as an annexure to their claim.Verbally saying that they went there and saw the top soil layer become a thick crust is not evidence. There are tests that are legally permitted in the courts of law which the expert committee doesn’t seem to have done.The WCF area occupied ~ 25 hectares of land out of the total 9300 hectares of the floodplain (Approximately 0.26 %)So to exert the high pressure for land compaction it would require numerous heavy weight rollers (which apparently weren’t used by the organizers).Furthermore, the curve for dense sand in ‘Void ratio v/s pressure in kg/sq.cm’ indicates that dense sand (as deposited by the Yamuna and Ganga) does not show noticeable changes with increase in pressure.It is an undeniable fact that this land parcel has been under agricultural practices since decades (if not centuries).Agricultural practices; tilling (harrowing); movement of farms equipment and agricultural machinery; movement of dumpers for unabated dumping of construction waste for years and then the movement for trucks and dumpers for removal of the same has already shaped the consolidation and /or compaction of this land parcel ages before the event of the World Culture Festival 2016 was organized.Moreover, it is important to note that the entire stage for the event was supported by a series of iron scaffolding with raft footing (shown in the figure below)An Engineering Marvel ?Nothing was anchored in the natural stratum to hold the stage, overall the stage had a floating foundation. Can’t believe? 
have a look at this :Pics: The stage was made of thousands of such scaffolding rods in lattice structure spread across 7.5 acre (stage area)Pic: Scaffolding structure (showing the highest level) used for construction of the stage ; photographed during the removal of the stage.Pic: Steel plate rod used for distribution of the load, without any anchorage in the ground; photographed during the removal of the stage.The stage had negligible impact on the ground. Overall the stage was a floating stage and the impact of a floating stage on sandy soil is insignificant. For the record, no cement foundation was done as can be seen from the pictures.Trivia: The physics behind this stage bears a strong visual analogy with a yogi sleeping on a bed of nails. As a matter of fact, this ancient technique used by hathayogis in India has been a source of inspiration for the design of this stage !Pic: The concept of “Yogi Nail Bed” used as an inspiration for the WCF stage is based on the principle of uniform distribution of weight over a large surface area, therefore the overall impact is extremely low or negligible.Finally, the only court permissible test to determine compaction of soil is the CBR Test (California Bearing Ratio Test). It involves taking soil samples before and after the event and then applying the test. Since the expert committee did not collect any soil samples before and after the event to come up with the alleged 13 cr damage,the Art of Living Foundation themselves requested the NGT to conduct the CBR test at the venue, and the application was duly rejected !Here is the permission letter made by the Art of Living Foundation to the NGT (which was disposed off by the NGT !)The Chairman of the Expert Committee of the NGT Mrs.Shashi Shekhar (IAS) (Ministry of Water Resouces, Govt. of India)even goes out to the extent of saying the compensation of 120 cr put forth on the Art of Living foundation as ad-hoc and unscientific and not based on any scientific assessments. The Chairman does not even endorse the compensation.Also, it is important to see whether the NGT conducted any scientific studies before quantifying the damage if any ?It would be very astounding for the reader to realize that no such thing was done. Only a mere “visual inspection” was conducted by the Expert Committee members of the NGT at the venue on the 6th of June 2016(Singular visit).And no scientific evidence and data samples have been provided by the NGT Expert Committee in the Court of Law.That’s like going to a doctor who after just glancing at your direction hands you a list of ailments he assesses that you suffer from and proceeds to slap you with a fat bill for your future treatments !It is surprising to believe that the Chairman of the Expert Committee Mr.Shashi Shekhar has distanced himself from the committee’s recommendations. The Chairman’s signature is also missing from the final report. And only 4 out the 7 Expert Committee members have signed the final report !7. 
Claim 7: Going from ecological “restoration” to ecological “rehabilitation” of the floodplainFacts:Throughout the first report, the Expert Committee members of the NGT have used the term “restoration” and in the final report they use the word “rehablitation”Why the sudden switch?Because, the NGT Expert Committee cannot prove any damage that was done to the floodplain and the environment by the event.In their final report this is what they state:It is not possible for the ‘Expert Committee’ to assess the ecological status of the site before and after the event? This was their primary job in the first place !Also, the committee points out that it is extremely difficult to assess the costs of environment damage and degradation accurately because“it requires substantial time, human and other resources to collect detailed quantitative information on the nature, extend and magnitude of various activities listed earlier for restoration”That’s a clever way of saying that they cannot prove the damage quantitatively and qualitatively and hence the question of restoration is redundant.They also go on to state that “estimation of the costs of restoration requires the preparation of a Detailed Project Report that may take several months to a year besides financial resources.”Who can buy that argument? Why was the Expert Panel commissioned in the first place?Moreover, the Expert Committee states that it has now decided to “REHABILITATE THE IMPACTED SITE”.The NGT’s proposed plan includes creating a bio diversity park, two large water bodies, three tier planting of vegetation,and establishing new sewage treatment plants,etcLet’s put things into perspective, firstly, the Committee says that it cannot prove any damage scientifically. Consequently they cannot assign costs to restore damage. Hence, they wish to switch the narrative from being a “restoration cost” to “rehabilitation cost”. And moreover, they wish to build a utopian biodiversity park for which the Art of Living should bear the cost ! (Slow claps !) Wait, I am not yet done !In order to build this dream park, the Expert Committee has submitted a ‘Budget’ for building this park. The budget lists ‘Salaries and Consultancies’ as a cost component to monitor and supervise the construction. This cost component totals up to 7 CRORE RUPEES ! Here is their estimations !That’s not all friends. The NGT Expert Committee even goes on record to state that “rehabilitation” will take a period of 10 years and the expert committee members have nominated themselves to undertake this project as “Consultants”(Indeed a very sly way of pocketing the 7 crore!)In the final analysis,the expert committee members in their final report state that they are unable to differentiate the activities required to restore the floodplain and the activities to undo the alleged damage due to the event. That is a very clever way of saying that they cannot differentiate the damage done to the floodplain before the event and the damage done by the event.As this article says,“The Art of Living case will go down as a test for environment activism in the country. 
The nation expects the NGT will gather enough courage to call the bluff of the committee and go by the merits of the case”.References:http://delhi.gov.in/wps/wcm/connect/55a9380047b2199a9155d5bdc775c0fb/Final_Report_NGT-Yamuna_Restoration%2B(11-4-2014).pdf?MOD=AJPERES&lmod=-287594179https://arxiv.org/ftp/arxiv/papers/1306/1306.2709.pdfhttp://www.cpcb.nic.in/newitems/11.pdfDelhi Development AuthorityTextbook of Soil Mechanics and Foundation EngineeringGoogle Earth – Google EarthNational Green TribunalHomepage | Ramsarflood plainCentral Pollution Control Board :::https://www.artofliving.org/in-en/newsroom/press-statement/independent-environmentalist-statement-ngt-reporthttp://www.who.int/water_sanitation_health/dwq/2edvol3d.pdfWill NGT call the Yamuna expert panel's bluff?Written by:Soham D’SouzaBachelors in Chemical Technology-Institute of Chemical Technology (former UDCT) , MumbaiMumbai

In NZ, what happens if a uni student doesn't qualify for student allowance or benefits due to parents' income, but the parents refuse to provide financial support, and the student can't work due to mental illness?

I’m in the same position. I saved up six grand going to garage sales and selling the stuff online. Then I applied for a student loan which paid my course fees and gave a maximum amount per week (164$ I think it was) which is about 50–100$ less than you need to live. My six grand paid for my many 100-200$ books and topped up the student loan amount so I could cover Living costs and would not need a job.So for one year it cost my six grand savings and the maximum amount of available loan which was 15,000. So 21k for the year living and eating well.You need to apply for a student loan. They’ll give you 164 a week (approximately) and pay your course fees (900$ per paper) you’ll need to live very frugally or have some way of topping that up (job, parents or your own savings, You have to pay that back: as soon as you get a job they garnish your wages to recover it. If you have no job ever again you’ll not be forced to pay it back. If you get an inheritance when Your parents die then ird will know= not sure how they deal with that however.No sickness benefit for students even if you’ve been on that for 20 years. No dole. If you get disability support payments (for doctors fees etc) you might keep some of these= you need to ask winz about that. You cannot stay on sickness or unemployment benefit when you’re a student.. I’m really smart, very confident and love knowledge, and have gotten effortless A’s my entire life. I’m good with basic computers and a gifted writer. I, however lacked self discipline as I wasn’t used to having to work too hard to be good at something, and if it was too hard for me (maths, video games for example) I would not even try. I had been in mental health remission for 3 years before I felt well enough to tackle university. I am 45 years old and I have borderline personality disorder, which is a mood regulation disfunction: your brain thinks you’re being attacked by a puma and triggers chemicals to be released into your bloodstream, but in reality you’ve just missed the bus or dropped your sandwich (or gotten a C) you cannot change or control this you have to develop copIng skills in psychotherapy and practise unitll they Become stroNg enough to override your unhelpful or destructive behaviours.But being smart is not what it takes. It takes self discipline, huge amounts. Time management from day one. If you fall behind in week one or two you’ll play Catch up all semester. You need to be able to concentrate and focus for two hours in lectures with very dense material. Each lecture has 20–40 pages of reading academic content before you sit down and then they build on that. These pages take five times longer to read than any other kind of writIng because 25% of the words they use will be new to you. Youll be expected to have these terms memorised from their first use, as they won’t give more than an initial definition = you will need to write flash cards or a glossary of terms and learn these by rote or you won’t inderstand what you are reading. When I began it would take 90 minutes to read one academic article and I would not understand it at all, and have to read it again taking notes and googling definitions and writing flash cards. By the end of the year, I could do it in 45 mins, know what most words meant and be able to regurgitate it well enough. 
Each essay requires you to read 4–16 of these and reference them, So you gotta get good fast!I studied psychology as my major and becuse you can only take 4 psychology papers in the first year i took sociology, anthropology, comparative religion, human development, And maori. (You need to pass first year papers in any subject you want to do in your second year)Every 2 weeks most classes will want a piece of written work from you: each one has to have 4–16 references so you’ll have to read 4–16 articles for each piece of work. you do this In Your own time and no one watches you= it’s 100% self discipline. I had to put my life on a clockwork schedule to keep up. If you don’t turn in your work no one says anything, or follows up, you just fail the paper.Essays get marked really really hard. Miss one comma, or fail to put a colon in a reference you loose 1%. you make this mistake five or six times then you drop a whole grade. You can get 100% on content and still get a C grade becuse you haven’t achieved the academic writing style. They don’t care about your content for the first year, they want you to become a good writer. You need this skill before you can have an opinion. If you don’t keep your grades up to minimum standard you fail, my psychology course requires a A- average from day one to be accepted to the next level. So you have to improve each time. Word limits are 1200 intially but are up to 3000 by the end of the year. You need to be good at essays before you begin, and you need to be good at research before you walk In that dooor becuse you’ll be asked to perform at level seven from day one. If you’ve not worked at level 5 or 6 before this will be a huge ask as you scramble to learn new skills ontop of courseload. They Expect you to be good at these things already. I had to go to every single help session available to learn how to get up to speed to keep my grades up. I had a weekly appointment with 2 student academic help staff= one helped me with maths and computer programs (that psychology dept assumed I knew how to do already) , and one to help me with academic writIng style and time management.Work load is about 60 hours a week to get good grades. Smart people can get away with 40 untill exam month. Not-so-smart people can actually get higher grades than smart people by working Harder. That means after a full day of lectures and tutorials you have to fit in 4–6 hours of study every day. Then You have to go home, have dinner and do 2 or 3 hours more study, and on the weekends you catch up on the stuff you didn’t get done. Remeber every second week you get at least one new assignment per paper, so if you get behind on one, the time you think you’ll have to catch up will be taken up by the new piece of work you’ve been given. Do not fall Behind! you cannot make excuses to staff, they do not care why you didn’t or didn’t do it. You get the grade you earned and that’s it. No do-overs or “I was sick” at university: they have 3000 millennials telling them this every day: and so they have to have a “no excuses” policy or all these staff members would do from sunrise to sunset is make arrangements for 3000 teenage kids, who didn’t do the work, and want to be allowed to repeat the exam or essay again. Death of immediate family member counts but your brothers wedding on the day of your exam? Not acceptable. If you go to that wedding you get 0% on the exam. One 0% on one thing and you may fail the whole paper. 
(This failed grade stays on your record forever, no pretending you ‘never took it’) it’s almost impossible making up from a zero: even if you’ve had all As so far, you’ll get a C overall with a zero exam result. Exams take precedence over your brothers wedding so make sure he knows your exam dates before he proposes lolNow: mental heath. The stress of university is extreme from day one. You fall behind: the pressure goes up. You don’t practise sleep hygiene and eat really really well, (no junk food) your body won’t be able to fuel 40-60 hours of usiNg your brain, you’ll get tired and run down and the info won’t go in. Being psysically fit before you start and getting regular exercise while you study is so very important to keep your energy levels up. Go in unfit with a junk food diet and not being able to get 8 hours sleep: your body will not have the fuel to keep your brain going at that high level for the 8 hours a day you need to. If you can’t focus on a task without being distracted for one hour you won’t be able to stay focussed in lectures, you need this skill in the first week, best to be good at this before you begin. Even the most healthy peolpe with best efforts are exhausted by the end of semester and this is right when when exams start. You have to still go to lectures, still do your weekly assignments and study for exams on top of that. So if you’re behind at this stage, you’ll find it hard to ramp up the effort another notch= which is an understatement. You’ll have to ramp up the effort five or Six notches.Does this sound stressful to you? Wait unitll you stand inside it and see what your mental illness does to you. Mine came out of remission and nearly killed me. I’m still sick nearly 2 years later. I got all A’s in first semester but the end of semester I was so unwell that I Barely Managed to pass my exams let alone get the grades to keep me competitive. And it’s VERY competitive: 500 people start and 8 get though to masters/honours and 2 inTo the PhD program. Then to be a clinical psychologist you need to get into the 2 year clinical programme for which you need A- average minimum over that last 4 years AND be an gifted communicator AND have extra curricular activities that show your leadership skills and your passion for the work. (yes kids you need to volunteer in your field durIng your university years to be eligible for the clinical program : because there are 8 places and 30 applicants and every single one of them is top of their class)The university (Waikato should be ashamed) was unable to support my mental health= 6 weeks waitIng lists for counsellors who were average as best. A system that bends over backward for wheelchairs but cannot touch The Sides of a personality disorder. They will let you sit exams alone in a room instead of with 500 others and you can get about 10% extra time, which is a huge advantage in my opinion.So before you start, make sure you have all the skills you need beFore you go. Learn how to write essays, learn the computer programs for writing and editIng essays before you go. Be able to break a paragraph down into points, be able to map out paragraphs into an essay. Be able to type pretty well before you go. learn how to use the computer programs you need before you go: they will tell you to make a graph using excell but they won’t train you how to use excell, they will tell you to write an essay in academic style but they won’t teach you academic style= you scramble to teach yourself from a book. 
They expect you to be able to use the language of your field but they don’t teach you the language of your field= you scramble to teach yourself with glossaries and flash cards. my first semester I learnt 400 new words because they appeared in my required reading and I didn’t know what they were: I had to google them, then I wrote them on flash cards and I used to flip through my cards as I walked between classes to memorise them.And this whole time no one will care how hard this is for you, because everyone is in the same boat and everyone who is a year above you has already been through it and they survived becuse they dug in and worked harder and that’s exactly what they think you should be doing. CryIng about mental illness won’t get you anywhere. I became a mental heath advocate at university becuse so many of us were not getting the support we needed. Waikato told us they could support our mental health and we believed them, but it was all a show for the DHB, behind the scenes there was no funding and meagre resources and staff spread too thin. All the cash went on wheelchair access and help for blind/deaf students. We worked really hard to get things they promised us in the curriculum and we achieved almost nothing and the staff were very angry at the group of us who protested and actually began to bully us. Do not go to the university of Waikato if Cara Tate is still head of disability services there. She treated us like toddlers having a tantrum instead of the well organised group of activists we were. Whatever university you choose make sure the mental heath services are adequate, And they have adequate funding. Waikato lost almost all of its mental health clients even ones who had been there 2 or 3 years when their system broke down. They asked their staff to cover 300% more clients for the same money and the staff all quit. Then I arrived and there was almost zero support. Most universities have huge problems in some areas. Waikato had huge problems in disability services and it hurt some of the most vulnerable members of the student body.Now add on top of that: it’s your first year at university : you don’t know anyone, you don’t know where your classes are and have to find them- (I got lost every day for the first month), you don’t know who to ask for help becuse you don’t know what help you need, everyone is in the same boat as you and is just as overwhelmed as you, most staff and students ahead of you are so sick of entitled millennials and lazy whiners there is no sympathy: they came though it by knuckling down and working harder and that’s what they expect you to do. You’re not considered as a serious student untill your second year and sometimes your third. And yes, that’s becuse a huge percentage fail. Most people fail. It’s academic level study. It’s meant to weed out the average and the undisciplined and so it does. It also weeds out the mentally ill unfortunately. 20% of people have a degree in NZ. many cannot afford it, and 80% of people who can afford it wont to be able to work to that standard and fail to graduate.What subject are you studying? Some are much harder than others. Hard sciences like physics etc are very dense and dry and you need to be talented and passionate for these fields. Engineering is super competitive and you need to be gifted in this mathematical way. Psychology is very academically dense and requires more personal growth than any other subject. You need to be a natural communicator and have great empathy and really care about people . 
You need to be able to have discussions in tutorials and are marked as fail if you stay silent all year. Shy people either overcome it or go into advertising. It’s so fascinating but every day you have to challenge your own behaviours and beliefs and self reflect. Law has a huge workload and lots of reading every week and lots of case studies to write. I picked this up by having friends who take these subjects but I don’t know too much. Anthropology was a excruciatingly hard because the four exams consist of 6 short essays (300 words) and one long essay (600–800 words) and you have 50 minutes. Oh yes you heard me right. It’s about 7 minutes per essay and ten for the long one. 55% of my class failed. But I had an extra 10 mins for my disability allowance. We knew the questions in advance. (Two that I remember were “Describe and date the five innovations that effected the pacific migrations” And “For what reasons was the Second World War described as being “the war on two home fronts”) So I wrote out my essays before hand and spent 2 days memorising them so I could write them out in the exam. Cheating? Or exploiting my ability to memorise scripts from my acting days? 84% thank you very much! (Not an A but good enough to get me top twenty out of 400 students) philosophy has huge amounts of readings and many many essays with incredibly strict criteria and you also have to speak in front of the class - many failed because they were unable to complete this and skipped that day and every day it was required. They didn’t realise failure to overcome fear of public speaking meant they could be the best philosopher in the world but for this reason they fail first year and are not allowed to take it in second year and thus ends that career.Grades: you can get a degree with Cs. But you cannot get a job with a C pass degree. Every highly paid job has 100 applicants who all have (minimum) bachelors degrees, they only interview A pass applicants. And I’m telling you right now there will be ten of them on that pile. A bachelor of psychology doesn’t not qualify you for anything except Human Resources or advertising, and a C pass won’t even get you that job.You cannot be a psychologist with a bachelors: you need a masters or honours or doctorate minimum. You won’t get into masters if you don’t have a A - average across your entire 3 years. You get one C and you need 2 A’s to balance that out to keep your average up to minimum.Now: I’m hoping you are going to pick an easy subject. I got 100% in maori because the entire semesters workload was equal to about 2 weeks of psychology workload. That easy paper made up for my lower grades in psych at the end of the year when I was too sick to function. I didn’t pick maori becuse I knew it would be easy, but I was so grateful because psychology and anthropology where extremely dense and had so many assignments and so much reading. Maori was youtube clips and powerpoints and tiny little 500 word essays I could write during the PowerPoints. 40% drop out rate for that class, next highest: psychology at 19% The “maori way of learning” didn’t make up for the laziness of the Maori students, many spent the entire class playing on their cell phones if they even bothered to show up. I experienced racism for the first time in that class because I was pakeha and this got worse when I got top scores for exams and essays. The emphasis was on the fact that I could not speak te reo. 
Hell I studied mandarin for a year and got nowhere, I took French and German at highschool and learnt nothing so clearly I suck at languages! No way was I going to learn a new language ontop of the rest of my workload but that logic did not lessen the disdain they showed me for my lack. It was very confusing for me to looked down on by people who needed 5 extensions on a 500 word essay on their own own Iwi. No extensions in any of my other papers were allowed.So, I hope I’ve scared you. Becuse if you don’t understand what is required before you go in you’ll be blindsided, if you have mental illness you’ll need a plan, and you’ll need to know your early warning signs and have ways to counteract these before you get too far gone.So. Get your fitness up if you’re unfit. Time yourself to concentrate for 15 mins without looking up, then go for 20, then 30. Train your mind to focus for one hour without getting distracted. If you are a millennial your cell phone addiction will be your greatest enemy: practise not looking at your phone for an hour, if you can’t then you need to train yourself bit by bit. You won’t have time to practise this skill during lectures, if you zone out fir even 30 seconds you can be lost for the next ten minutes and that’s you catching up at home and possibly one mark you won’t get on the exam. Get a book on academic writing style and practice before you go (apa referencing) Google “how to do academic research” and practise and practise= it’s the single hardest skill to develop (in my opinion) (hint: only academic sources are considered viable: you cannot refrerence ANYTHING you find on google, it has to be the original work and finding that original work can be very time consuming). Find some academic articles and read them to get an idea of the language and format you must use. For psychology you need to know how to write A lab report and use excell calculations to produce graphs from Raw data. You’re maths needs to be able to pass a statistics paper in second year so get a tutor right away if you can’t grasp mathematical formulae. Learn to use ENDNOTE before you go or you’ll be doing Ten times the work long hand to reference your research.I wish someone had told me this before I went. I knew it would be hard but it was ten times harder than I thought. I got a B+ average. But my mental health has yet to recover and I owe the government 15 thousand dollars and at this stage I could not return to study, Hell I can’t even organise my socks at this point.I’m going to try doing one paper at a time each semester (and stay on the sickness benefit which is free and doesn’t need to be paid back) then maybe two per semester as I struggle towards remission and try to piece it together that way. I may never achieve my goal of having a degree but I’m not giving up= I’m just putting my mental health first. Learned that the hard way.most mental health clients I met at university took two papers per semester (full time it’s 4 papers for each of 2 semesters) to lessen the pressure. This means much more student debt as it takes twice as many years you need to borrow that 164$ a week.I admire them! My mate took 6 years to get an accountancy degree because he had Aspergers (and a non supportive family) he worked as hard as he could and he graduated at 26. He got a job right away. Becuse he’s so awesome and very good at accounting (Aspergers actually helps with this he reckons) all he wanted was a shot at a normal life, and he has evEry chance at that now. 
After a lifetime of poverty lving in a sh%@#hole and eating 2 minute noodles he has a cute rag top car and is saving for a house deposit. The woman who marries him will be blessed by his strength of character and will to overcome. His kids will have a wonderfull example to follow.Good luck. Don’t let this stop your dream. Just be really prepared, accept you might have to go slowly, and put your mental health first . I didn’t, and I wish someone had written me this letter before I went. I still would have gone. But I would have done it differently and focussed on my mental health and support systems and fitness and self discipline before I went. I’ve lost a lot to this illness. I will probably never get to be a psychologist even though I’m born to do it. But accepting that these are the cards I was dealt and trying to play a good hand anyway is all I can do.

Why and how are pages ranked higher or lower?

PageRank is an algorithm used by Google Search to rank websites in their search engine results. PageRank was named after Larry Page,[1] one of the founders of Google. PageRank is a way of measuring the importance of website pages. According to Google:

PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.[2]

It is not the only algorithm used by Google to order search engine results, but it is the first algorithm that was used by the company, and it is the best known.[3][4]

Description

(Figure: cartoon illustrating the basic principle of PageRank. The size of each face is proportional to the total size of the other faces pointing to it.)

PageRank is a link analysis algorithm and it assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted by [math]{\displaystyle PR(E)}[/math]. Other factors, like Author Rank, can contribute to the importance of an entity.

A PageRank results from a mathematical algorithm based on the webgraph, created by all World Wide Web pages as nodes and hyperlinks as edges, taking into consideration authority hubs such as cnn.com or usa.gov. The rank value indicates the importance of a particular page. A hyperlink to a page counts as a vote of support. The PageRank of a page is defined recursively and depends on the number and PageRank metric of all pages that link to it ("incoming links"). A page that is linked to by many pages with high PageRank receives a high rank itself.

Numerous academic papers concerning PageRank have been published since Page and Brin's original paper.[5] In practice, the PageRank concept may be vulnerable to manipulation. Research has been conducted into identifying falsely influenced PageRank rankings.
The goal is to find an effective means of ignoring links from documents with falsely influenced PageRank.[6]

Other link-based ranking algorithms for Web pages include the HITS algorithm invented by Jon Kleinberg (used by Teoma and now Ask.com), the IBM CLEVER project, the TrustRank algorithm and the Hummingbird algorithm.[citation needed]

History

The eigenvalue problem was suggested in 1976 by Gabriel Pinski and Francis Narin, who worked on scientometrics ranking scientific journals,[7] in 1977 by Thomas Saaty in his concept of the Analytic Hierarchy Process, which weighted alternative choices,[8] and in 1995 by Bradley Love and Steven Sloman as a cognitive model for concepts, the centrality algorithm.[9][10]

PageRank was developed at Stanford University by Larry Page and Sergey Brin in 1996 as part of a research project about a new kind of search engine.[11] Sergey Brin had the idea that information on the web could be ordered in a hierarchy by "link popularity": a page is ranked higher as there are more links to it.[12] The work was co-authored by Rajeev Motwani and Terry Winograd. The first paper about the project, describing PageRank and the initial prototype of the Google search engine, was published in 1998;[5] shortly after, Page and Brin founded Google Inc., the company behind the Google search engine. While just one of many factors that determine the ranking of Google search results, PageRank continues to provide the basis for all of Google's web search tools.[13]

The name "PageRank" plays off of the name of developer Larry Page, as well as the concept of a web page.[14] The word is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares of Google in exchange for use of the patent; the shares were sold in 2005 for $336 million.[15][16]

PageRank was influenced by citation analysis, developed early by Eugene Garfield in the 1950s at the University of Pennsylvania, and by Hyper Search, developed by Massimo Marchiori at the University of Padua. In the same year PageRank was introduced (1998), Jon Kleinberg published his important work on HITS. Google's founders cite Garfield, Marchiori, and Kleinberg in their original papers.[5][17]

A small search engine called "RankDex" from IDD Information Services, designed by Robin Li, had been exploring a similar strategy for site-scoring and page ranking since 1996.[18] The technology in RankDex was patented by 1999[19] and used later when Li founded Baidu in China.[20][21] Larry Page referenced Li's work in some of his U.S. patents for PageRank.[22]

Algorithm

The PageRank algorithm outputs a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided among all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called "iterations", through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.

A probability is expressed as a numeric value between 0 and 1. A 0.5 probability is commonly expressed as a "50% chance" of something happening.
Hence, a PageRank of 0.5 means there is a 50% chance that a person clicking on a random link will be directed to the document with the 0.5 PageRank.

Simplified algorithm

Assume a small universe of four web pages: A, B, C and D. Links from a page to itself are ignored. Multiple outbound links from one page to another page are treated as a single link. PageRank is initialized to the same value for all pages. In the original form of PageRank, the sum of PageRank over all pages was the total number of pages on the web at that time, so each page in this example would have an initial value of 1. However, later versions of PageRank, and the remainder of this section, assume a probability distribution between 0 and 1. Hence the initial value for each page in this example is 0.25.

The PageRank transferred from a given page to the targets of its outbound links upon the next iteration is divided equally among all outbound links.

If the only links in the system were from pages B, C, and D to A, each link would transfer 0.25 PageRank to A upon the next iteration, for a total of 0.75:

[math]{\displaystyle PR(A)=PR(B)+PR(C)+PR(D).\,}[/math]

Suppose instead that page B had a link to pages C and A, page C had a link to page A, and page D had links to all three pages. Thus, upon the first iteration, page B would transfer half of its existing value, or 0.125, to page A and the other half, or 0.125, to page C. Page C would transfer all of its existing value, 0.25, to the only page it links to, A. Since D had three outbound links, it would transfer one third of its existing value, or approximately 0.083, to A. At the completion of this iteration, page A will have a PageRank of approximately 0.458:

[math]{\displaystyle PR(A)={\frac {PR(B)}{2}}+{\frac {PR(C)}{1}}+{\frac {PR(D)}{3}}.\,}[/math]

In other words, the PageRank conferred by an outbound link is equal to the document's own PageRank score divided by the number of outbound links L( ):

[math]{\displaystyle PR(A)={\frac {PR(B)}{L(B)}}+{\frac {PR(C)}{L(C)}}+{\frac {PR(D)}{L(D)}}.\,}[/math]

In the general case, the PageRank value for any page u can be expressed as

[math]{\displaystyle PR(u)=\sum _{v\in B_{u}}{\frac {PR(v)}{L(v)}}}[/math],

i.e. the PageRank value for a page u is dependent on the PageRank values for each page v contained in the set B_u (the set containing all pages linking to page u), divided by the number L(v) of links from page v.

Damping factor

The PageRank theory holds that an imaginary surfer who is randomly clicking on links will eventually stop clicking. The probability, at any step, that the person will continue is a damping factor d. Various studies have tested different damping factors, but it is generally assumed that the damping factor will be set around 0.85.[5]

The damping factor is subtracted from 1 (and in some variations of the algorithm, the result is divided by the number of documents N in the collection) and this term is then added to the product of the damping factor and the sum of the incoming PageRank scores. That is,

[math]{\displaystyle PR(A)={1-d \over N}+d\left({\frac {PR(B)}{L(B)}}+{\frac {PR(C)}{L(C)}}+{\frac {PR(D)}{L(D)}}+\,\cdots \right).}[/math]

So any page's PageRank is derived in large part from the PageRanks of other pages. The damping factor adjusts the derived value downward.
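To make the worked example above concrete, here is a minimal Python sketch of the simplified and damped update rules. It is illustrative only: the function and variable names are made up, and this is not how Google computes PageRank at scale. Note that page A has no outbound links (a "sink"), which this simple version does not correct for; the full algorithm's handling of sinks is described further below.

# Four-page example from the text: B links to A and C, C links to A, D links to A, B, C.
links = {"A": [], "B": ["A", "C"], "C": ["A"], "D": ["A", "B", "C"]}
N = len(links)

def pagerank_step(pr, d):
    """One application of PR(u) = (1 - d)/N + d * sum over pages v linking to u of PR(v)/L(v)."""
    new_pr = {}
    for u in links:
        incoming = sum(pr[v] / len(links[v]) for v in links if u in links[v])
        new_pr[u] = (1 - d) / N + d * incoming
    return new_pr

pr = {p: 1.0 / N for p in links}          # initial value 0.25 for every page

# With d = 1 (no damping) a single step reproduces the number from the text:
# PR(A) = 0.25/2 + 0.25/1 + 0.25/3, i.e. about 0.458.
print(round(pagerank_step(pr, d=1.0)["A"], 3))

# With the usual d = 0.85, repeat the step until the values settle.
pr = {p: 1.0 / N for p in links}
for _ in range(50):
    pr = pagerank_step(pr, d=0.85)
print({p: round(v, 3) for p, v in pr.items()})

Running the single undamped step prints the 0.458 figure from the example; with d = 0.85 the values settle after a few dozen iterations.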
The original paper, however, gave the following formula, which has led to some confusion:

[math]{\displaystyle PR(A)=1-d+d\left({\frac {PR(B)}{L(B)}}+{\frac {PR(C)}{L(C)}}+{\frac {PR(D)}{L(D)}}+\,\cdots \right).}[/math]

The difference between them is that the PageRank values in the first formula sum to one, while in the second formula each PageRank is multiplied by N and the sum becomes N. A statement in Page and Brin's paper that "the sum of all PageRanks is one"[5] and claims by other Google employees[23] support the first variant of the formula above.

Page and Brin confused the two formulas in their most popular paper "The Anatomy of a Large-Scale Hypertextual Web Search Engine", where they mistakenly claimed that the latter formula formed a probability distribution over web pages.[5]

Google recalculates PageRank scores each time it crawls the Web and rebuilds its index. As Google increases the number of documents in its collection, the initial approximation of PageRank decreases for all documents.

The formula uses a model of a random surfer who gets bored after several clicks and switches to a random page. The PageRank value of a page reflects the chance that the random surfer will land on that page by clicking on a link. It can be understood as a Markov chain in which the states are pages, and the transitions, which are all equally probable, are the links between pages.

If a page has no links to other pages, it becomes a sink and therefore terminates the random surfing process. If the random surfer arrives at a sink page, it picks another URL at random and continues surfing again. When calculating PageRank, pages with no outbound links are assumed to link out to all other pages in the collection. Their PageRank scores are therefore divided evenly among all other pages. In other words, to be fair with pages that are not sinks, these random transitions are added to all nodes in the Web, with a residual probability usually set to d = 0.85, estimated from the frequency that an average surfer uses his or her browser's bookmark feature.

So, the equation is as follows:

[math]{\displaystyle PR(p_{i})={\frac {1-d}{N}}+d\sum _{p_{j}\in M(p_{i})}{\frac {PR(p_{j})}{L(p_{j})}}}[/math]

where [math]{\displaystyle p_{1},p_{2},...,p_{N}}[/math] are the pages under consideration, [math]{\displaystyle M(p_{i})}[/math] is the set of pages that link to [math]{\displaystyle p_{i}}[/math], [math]{\displaystyle L(p_{j})}[/math] is the number of outbound links on page [math]{\displaystyle p_{j}}[/math], and [math]{\displaystyle N}[/math] is the total number of pages.

The PageRank values are the entries of the dominant right eigenvector of the modified adjacency matrix rescaled so that each column adds up to one. This makes PageRank a particularly elegant metric: the eigenvector is

[math]{\displaystyle \mathbf {R} ={\begin{bmatrix}PR(p_{1})\\PR(p_{2})\\\vdots \\PR(p_{N})\end{bmatrix}}}[/math]

where R is the solution of the equation

[math]{\displaystyle \mathbf {R} ={\begin{bmatrix}{(1-d)/N}\\{(1-d)/N}\\\vdots \\{(1-d)/N}\end{bmatrix}}+d{\begin{bmatrix}\ell (p_{1},p_{1})&\ell (p_{1},p_{2})&\cdots &\ell (p_{1},p_{N})\\\ell (p_{2},p_{1})&\ddots &&\vdots \\\vdots &&\ell (p_{i},p_{j})&\\\ell (p_{N},p_{1})&\cdots &&\ell (p_{N},p_{N})\end{bmatrix}}\mathbf {R} }[/math]

where the adjacency function [math]{\displaystyle \ell (p_{i},p_{j})}[/math] is the ratio between the number of links outbound from page j to page i and the total number of outbound links of page j.
The adjacency function [math]{\displaystyle \ell (p_{i},p_{j})}[/math] is 0 if page [math]{\displaystyle p_{j}}[/math] does not link to [math]{\displaystyle p_{i}}[/math], and it is normalized such that, for each j,

[math]{\displaystyle \sum _{i=1}^{N}\ell (p_{i},p_{j})=1}[/math],

i.e. the elements of each column sum up to 1, so the matrix is a stochastic matrix (for more details see the computation section below). Thus this is a variant of the eigenvector centrality measure used commonly in network analysis.

Because of the large eigengap of the modified adjacency matrix above,[24] the values of the PageRank eigenvector can be approximated to within a high degree of accuracy within only a few iterations.

Google's founders, in their original paper,[17] reported that the PageRank algorithm for a network consisting of 322 million links (in-edges and out-edges) converges to within a tolerable limit in 52 iterations. The convergence in a network of half the above size took approximately 45 iterations. Through this data, they concluded the algorithm can be scaled very well and that the scaling factor for extremely large networks would be roughly linear in [math]{\displaystyle \log n}[/math], where n is the size of the network.

As a result of Markov theory, it can be shown that the PageRank of a page is the probability of arriving at that page after a large number of clicks. This happens to equal [math]{\displaystyle t^{-1}}[/math] where [math]{\displaystyle t}[/math] is the expectation of the number of clicks (or random jumps) required to get from the page back to itself.

One main disadvantage of PageRank is that it favors older pages. A new page, even a very good one, will not have many links unless it is part of an existing site (a site being a densely connected set of pages, such as Wikipedia). Several strategies have been proposed to accelerate the computation of PageRank.[25]

Various strategies to manipulate PageRank have been employed in concerted efforts to improve search results rankings and monetize advertising links. These strategies have severely impacted the reliability of the PageRank concept,[citation needed] which purports to determine which documents are actually highly valued by the Web community. Since December 2007, when it started actively penalizing sites selling paid text links, Google has combatted link farms and other schemes designed to artificially inflate PageRank. How Google identifies link farms and other PageRank manipulation tools is among Google's trade secrets.

Computation

PageRank can be computed either iteratively or algebraically. The iterative method can be viewed as the power iteration method[26][27] or the power method. The basic mathematical operations performed are identical.
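Before working through these exact procedures, the random-surfer interpretation described above can also be sanity-checked by direct simulation. The Python sketch below is illustrative only: it reuses the four-page example from earlier, all names are invented, and a page with no outbound links is treated as a jump to a random page, as described above. The visit frequencies it prints approximate the PageRank values that the iterative, algebraic and power methods below compute exactly.

import random

links = {"A": [], "B": ["A", "C"], "C": ["A"], "D": ["A", "B", "C"]}
pages = list(links)
d = 0.85
steps = 200_000

visits = {p: 0 for p in pages}
page = random.choice(pages)
for _ in range(steps):
    visits[page] += 1
    if links[page] and random.random() < d:
        page = random.choice(links[page])   # follow a random outbound link
    else:
        page = random.choice(pages)         # bored surfer (or dangling page): jump to a random page

print({p: round(visits[p] / steps, 3) for p in pages})   # approximate PageRank of each page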
Iterative

At [math]{\displaystyle t=0}[/math], an initial probability distribution is assumed, usually

[math]{\displaystyle PR(p_{i};0)={\frac {1}{N}}}[/math].

At each time step, the computation, as detailed above, yields

[math]{\displaystyle PR(p_{i};t+1)={\frac {1-d}{N}}+d\sum _{p_{j}\in M(p_{i})}{\frac {PR(p_{j};t)}{L(p_{j})}}}[/math],

or in matrix notation

[math]{\displaystyle \mathbf {R} (t+1)=d{\mathcal {M}}\mathbf {R} (t)+{\frac {1-d}{N}}\mathbf {1} }[/math], (*)

where [math]{\displaystyle \mathbf {R} _{i}(t)=PR(p_{i};t)}[/math] and [math]{\displaystyle \mathbf {1} }[/math] is the column vector of length [math]{\displaystyle N}[/math] containing only ones.

The matrix [math]{\displaystyle {\mathcal {M}}}[/math] is defined as

[math]{\displaystyle {\mathcal {M}}_{ij}={\begin{cases}1/L(p_{j}),&{\mbox{if }}j{\mbox{ links to }}i\ \\0,&{\mbox{otherwise}}\end{cases}}}[/math]

i.e.,

[math]{\displaystyle {\mathcal {M}}:=(K^{-1}A)^{T}}[/math],

where [math]{\displaystyle A}[/math] denotes the adjacency matrix of the graph and [math]{\displaystyle K}[/math] is the diagonal matrix with the outdegrees in the diagonal.

The computation ends when for some small [math]{\displaystyle \epsilon }[/math]

[math]{\displaystyle |\mathbf {R} (t+1)-\mathbf {R} (t)|<\epsilon }[/math],

i.e., when convergence is assumed.

Algebraic

For [math]{\displaystyle t\to \infty }[/math] (i.e., in the steady state), the above equation (*) reads

[math]{\displaystyle \mathbf {R} =d{\mathcal {M}}\mathbf {R} +{\frac {1-d}{N}}\mathbf {1} }[/math]. (**)

The solution is given by

[math]{\displaystyle \mathbf {R} =(\mathbf {I} -d{\mathcal {M}})^{-1}{\frac {1-d}{N}}\mathbf {1} }[/math],

with the identity matrix [math]{\displaystyle \mathbf {I} }[/math]. The solution exists and is unique for [math]{\displaystyle 0<d<1}[/math]. This can be seen by noting that [math]{\displaystyle {\mathcal {M}}}[/math] is by construction a stochastic matrix and hence has an eigenvalue equal to one as a consequence of the Perron–Frobenius theorem.

Power Method

If the matrix [math]{\displaystyle {\mathcal {M}}}[/math] is a transition probability, i.e., column-stochastic, and [math]{\displaystyle \mathbf {R} }[/math] is a probability distribution (i.e., [math]{\displaystyle |\mathbf {R} |=1}[/math], [math]{\displaystyle \mathbf {E} \mathbf {R} =\mathbf {1} }[/math] where [math]{\displaystyle \mathbf {E} }[/math] is the matrix of all ones), Eq. (**) is equivalent to

[math]{\displaystyle \mathbf {R} =\left(d{\mathcal {M}}+{\frac {1-d}{N}}\mathbf {E} \right)\mathbf {R} =:{\widehat {\mathcal {M}}}\mathbf {R} }[/math]. (***)

Hence PageRank [math]{\displaystyle \mathbf {R} }[/math] is the principal eigenvector of [math]{\displaystyle {\widehat {\mathcal {M}}}}[/math]. A fast and easy way to compute this is using the power method: starting with an arbitrary vector [math]{\displaystyle x(0)}[/math], the operator [math]{\displaystyle {\widehat {\mathcal {M}}}}[/math] is applied in succession, i.e.,

[math]{\displaystyle x(t+1)={\widehat {\mathcal {M}}}x(t)}[/math],

until

[math]{\displaystyle |x(t+1)-x(t)|<\epsilon }[/math].

Note that in Eq. (***) the matrix on the right-hand side in the parenthesis can be interpreted as

[math]{\displaystyle {\frac {1-d}{N}}\mathbf {E} =(1-d)\mathbf {P} \mathbf {1} ^{t}}[/math],

where [math]{\displaystyle \mathbf {P} }[/math] is an initial probability distribution. In the current case

[math]{\displaystyle \mathbf {P} :={\frac {1}{N}}\mathbf {1} }[/math].

Finally, if [math]{\displaystyle {\mathcal {M}}}[/math] has columns with only zero values, they should be replaced with the initial probability vector [math]{\displaystyle \mathbf {P} }[/math]. In other words,

[math]{\displaystyle {\mathcal {M}}^{\prime }:={\mathcal {M}}+{\mathcal {D}}}[/math],

where the matrix [math]{\displaystyle {\mathcal {D}}}[/math] is defined as

[math]{\displaystyle {\mathcal {D}}:=\mathbf {P} \mathbf {D} ^{t}}[/math],

with

[math]{\displaystyle \mathbf {D} _{i}={\begin{cases}1,&{\mbox{if }}L(p_{i})=0\ \\0,&{\mbox{otherwise}}\end{cases}}}[/math]

In this case, the above two computations using [math]{\displaystyle {\mathcal {M}}}[/math] only give the same PageRank if their results are normalized:

[math]{\displaystyle \mathbf {R} _{\textrm {power}}={\frac {\mathbf {R} _{\textrm {iterative}}}{|\mathbf {R} _{\textrm {iterative}}|}}={\frac {\mathbf {R} _{\textrm {algebraic}}}{|\mathbf {R} _{\textrm {algebraic}}|}}}[/math].
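As a small numerical illustration of this matrix formulation, the following Python sketch builds the matrix defined above from the adjacency matrix and out-degrees of the four-page example used earlier, applies the dangling-column replacement just described, and computes R both algebraically and by power iteration. It is a sketch only (names are illustrative and it is not Google's implementation); the Octave code below takes the same power-method route.

import numpy as np

# Adjacency matrix: A[i, j] = 1 if page i links to page j, pages ordered A, B, C, D.
A = np.array([
    [0, 0, 0, 0],   # A links to nothing (dangling node)
    [1, 0, 1, 0],   # B links to A and C
    [1, 0, 0, 0],   # C links to A
    [1, 1, 1, 0],   # D links to A, B and C
], dtype=float)

N = A.shape[0]
d = 0.85

outdeg = A.sum(axis=1)                     # diagonal of K: number of outbound links per page
M = np.zeros((N, N))
for j in range(N):
    if outdeg[j] == 0:
        M[:, j] = 1.0 / N                  # dangling page: treat it as linking to every page
    else:
        M[:, j] = A[j, :] / outdeg[j]      # column j: links out of page j, equally weighted

# Algebraic route: R = (I - d*M)^(-1) * ((1 - d)/N) * 1
R_alg = np.linalg.solve(np.eye(N) - d * M, (1 - d) / N * np.ones(N))

# Power-method route on M_hat = d*M + ((1 - d)/N) * E, starting from the uniform vector
R_pow = np.full(N, 1.0 / N)
for _ in range(100):
    R_pow = d * M @ R_pow + (1 - d) / N

print(np.round(R_alg, 3))   # roughly [0.451, 0.171, 0.244, 0.133] for pages A, B, C, D
print(np.round(R_pow, 3))   # should match the algebraic solution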
PageRank MATLAB/Octave implementation:

% Parameter M: adjacency matrix where M_i,j represents the link from 'j' to 'i',
% such that for all 'j', sum(i, M_i,j) = 1
% Parameter d: damping factor
% Parameter v_quadratic_error: quadratic error for v
% Return v, a vector of ranks such that v_i is the i-th rank from [0, 1]

function [v] = rank2(M, d, v_quadratic_error)
  N = size(M, 2); % N is equal to either dimension of M and the number of documents
  v = rand(N, 1);
  v = v ./ norm(v, 1); % This is now L1, not L2
  last_v = ones(N, 1) * inf;
  M_hat = (d .* M) + (((1 - d) / N) .* ones(N, N));
  while (norm(v - last_v, 2) > v_quadratic_error)
    last_v = v;
    v = M_hat * v; % removed the L2 norm of the iterated PR
  end
end % function

Example of code calling the rank function defined above:

M = [0 0 0 0 1 ; 0.5 0 0 0 0 ; 0.5 0 0 0 0 ; 0 1 0.5 0 0 ; 0 0 0.5 1 0];
rank2(M, 0.80, 0.001)

This example takes 13 iterations to converge.

Variations

PageRank of an undirected graph

The PageRank of an undirected graph G is statistically close to the degree distribution of the graph G,[28] but they are generally not identical: if R is the PageRank vector defined above, and D is the degree distribution vector

[math]{\displaystyle D={1 \over 2|E|}{\begin{bmatrix}deg(p_{1})\\deg(p_{2})\\\vdots \\deg(p_{N})\end{bmatrix}}}[/math]

where [math]{\displaystyle deg(p_{i})}[/math] denotes the degree of vertex [math]{\displaystyle p_{i}}[/math], and E is the edge-set of the graph, then, with [math]{\displaystyle Y={1 \over N}\mathbf {1} }[/math], the following holds:[29]

[math]{\displaystyle {1-d \over 1+d}\|Y-D\|_{1}\leq \|R-D\|_{1}\leq \|Y-D\|_{1},}[/math]

that is, the PageRank of an undirected graph equals the degree distribution vector if and only if the graph is regular, i.e., every vertex has the same degree.

Generalization of PageRank and eigenvector centrality for ranking objects of two kinds

A generalization of PageRank for the case of ranking two interacting groups of objects has been described.[30] In applications it may be necessary to model systems having objects of two kinds where a weighted relation is defined on object pairs. This leads to considering bipartite graphs. For such graphs two related positive or nonnegative irreducible matrices corresponding to vertex partition sets can be defined. One can compute rankings of objects in both groups as eigenvectors corresponding to the maximal positive eigenvalues of these matrices. Normed eigenvectors exist and are unique by the Perron or Perron–Frobenius theorem. Example: consumers and products, where the relation weight is the product consumption rate.
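As a quick numerical illustration of the undirected-graph property stated above (a sketch only; the helper function and the two example graphs are invented for illustration): for a regular graph such as a 4-cycle the PageRank vector coincides with the degree distribution, while for a non-regular graph such as a star the two are close but not identical.

import numpy as np

def pagerank_undirected(adj, d=0.85):
    """Solve R = (1-d)/N + d*M*R for an undirected graph given as a symmetric 0/1 matrix."""
    N = adj.shape[0]
    M = adj / adj.sum(axis=0, keepdims=True)       # column-normalize (assumes no isolated vertices)
    return np.linalg.solve(np.eye(N) - d * M, (1 - d) / N * np.ones(N))

cycle = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]], float)   # 4-cycle, 2-regular
star  = np.array([[0,1,1,1],[1,0,0,0],[1,0,0,0],[1,0,0,0]], float)   # star with 3 leaves

for name, adj in [("cycle", cycle), ("star", star)]:
    R = pagerank_undirected(adj)
    D = adj.sum(axis=1) / adj.sum()                # degree distribution: deg(v) / 2|E|
    print(name, np.round(R, 3), np.round(D, 3))

# cycle: PageRank and the degree distribution are both uniform (0.25 each).
# star:  the degree distribution is [0.5, 0.167, 0.167, 0.167], while PageRank is only close to it.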
Distributed algorithm for PageRank computation

There are simple and fast random walk-based distributed algorithms for computing the PageRank of nodes in a network.[31] They present a simple algorithm that takes [math]{\displaystyle O(\log n/\epsilon )}[/math] rounds with high probability on any graph (directed or undirected), where n is the network size and [math]{\displaystyle \epsilon }[/math] is the reset probability ([math]{\displaystyle 1-\epsilon }[/math] is also called the damping factor) used in the PageRank computation. They also present a faster algorithm that takes [math]{\displaystyle O({\sqrt {\log n}}/\epsilon )}[/math] rounds in undirected graphs. Both of the above algorithms are scalable, as each node processes and sends only a small (polylogarithmic in n, the network size) number of bits per round.

Google Toolbar

The Google Toolbar long had a PageRank feature which displayed a visited page's PageRank as a whole number between 0 and 10. The most popular websites displayed a PageRank of 10; the least showed a PageRank of 0. Google has not disclosed the specific method for determining a Toolbar PageRank value, which is to be considered only a rough indication of the value of a website. In March 2016 Google announced it would no longer support this feature, and the underlying API would soon cease to operate.[32]

SERP rank

The search engine results page (SERP) is the actual result returned by a search engine in response to a keyword query. The SERP consists of a list of links to web pages with associated text snippets. The SERP rank of a web page refers to the placement of the corresponding link on the SERP, where higher placement means higher SERP rank. The SERP rank of a web page is a function not only of its PageRank, but of a relatively large and continuously adjusted set of factors (over 200).[33] Search engine optimization (SEO) is aimed at influencing the SERP rank for a website or a set of web pages.

Positioning of a webpage on Google SERPs for a keyword depends on relevance and reputation, also known as authority and popularity. PageRank is Google's indication of its assessment of the reputation of a webpage: it is non-keyword specific. Google uses a combination of webpage and website authority to determine the overall authority of a webpage competing for a keyword.[34] The PageRank of the home page of a website is the best indication Google offers for website authority.[35] After the introduction of Google Places into the mainstream organic SERP, numerous other factors in addition to PageRank affect the ranking of a business in Local Business Results.[36]

Google directory PageRank

The Google Directory PageRank was an 8-unit measurement. Unlike the Google Toolbar, which shows a numeric PageRank value upon mouseover of the green bar, the Google Directory only displayed the bar, never the numeric values. Google Directory was closed on July 20, 2011.[37]

False or spoofed PageRank

In the past, the PageRank shown in the Toolbar was easily manipulated. Redirection from one page to another, either via an HTTP 302 response or a "Refresh" meta tag, caused the source page to acquire the PageRank of the destination page. Hence, a new page with PR 0 and no incoming links could have acquired PR 10 by redirecting to the Google home page. This spoofing technique was a known vulnerability.
Spoofing can generally be detected by performing a Google search for a source URL; if the URL of an entirely different site is displayed in the results, the latter URL may represent the destination of a redirection.

Manipulating PageRank

For search engine optimization purposes, some companies offer to sell high-PageRank links to webmasters.[38] As links from higher-PR pages are believed to be more valuable, they tend to be more expensive. It can be an effective and viable marketing strategy to buy link advertisements on content pages of quality and relevant sites to drive traffic and increase a webmaster's link popularity. However, Google has publicly warned webmasters that if they are or were discovered to be selling links for the purpose of conferring PageRank and reputation, their links will be devalued (ignored in the calculation of other pages' PageRanks). The practice of buying and selling links is intensely debated across the webmaster community. Google advises webmasters to use the nofollow HTML attribute value on sponsored links. According to Matt Cutts, Google is concerned about webmasters who try to game the system, and thereby reduce the quality and relevance of Google search results.[38]

Directed Surfer Model

A more intelligent surfer probabilistically hops from page to page depending on the content of the pages and the query terms the surfer is looking for. This model is based on a query-dependent PageRank score of a page which, as the name suggests, is also a function of the query. When given a multiple-term query, Q = {q1, q2, ...}, the surfer selects a term q according to some probability distribution P(q) and uses that term to guide its behavior for a large number of steps. It then selects another term according to the distribution to determine its behavior, and so on. The resulting distribution over visited web pages is QD-PageRank.[39]

Social components

The PageRank algorithm has major effects on society as it contains a social influence. As opposed to the scientific viewpoint of PageRank as an algorithm, the humanities instead view it through a lens examining its social components. In these instances, it is dissected and reviewed not for its technological advancement in the field of search engines, but for its societal influences.[40]

Laura Granka discusses PageRank by describing how the pages are not simply ranked via popularity, as they contain a reliability that gives them a trustworthy quality. This has led to a development of behavior that is directly linked to PageRank. PageRank is viewed as the definitive rank of products and businesses and thus can manipulate thinking. The information that is available to individuals is what shapes thinking and ideology, and PageRank is the device that displays this information. The results shown are the forum through which information is delivered to the public, and these results have a societal impact as they will affect how a person thinks and acts.[41]

Katja Mayer views PageRank as a social network, as it connects differing viewpoints and thoughts in a single place. People go to PageRank for information and are flooded with citations of other authors who also have an opinion on the topic. This creates a social aspect where everything can be discussed and collected to provoke thinking. There is a social relationship between PageRank and the people who use it, as it is constantly adapting and changing to the shifts in modern society.
Viewing the relationship between PageRank and the individual through sociometry allows for an in-depth look at the connection that results.[42]

Matteo Pasquinelli reckons the basis for the belief that PageRank has a social component lies in the idea of attention economy. With attention economy, value is placed on products that receive a greater amount of human attention, and the results at the top of the PageRank garner a larger amount of focus than those on subsequent pages. The outcomes with the higher PageRank will therefore enter the human consciousness to a larger extent. These ideas can influence decision-making, and the actions of the viewer have a direct relation to the PageRank. They possess a higher potential to attract a user's attention as their location increases the attention economy attached to the site. With this location they can receive more traffic, and their online marketplace will have more purchases. The PageRank of these sites allows them to be trusted, and they are able to parlay this trust into increased business.

Other uses

The mathematics of PageRank are entirely general and apply to any graph or network in any domain. Thus, PageRank is now regularly used in bibliometrics, social and information network analysis, and for link prediction and recommendation. It is even used for systems analysis of road networks, as well as in biology, chemistry, neuroscience, and physics.[43]

In neuroscience, the PageRank of a neuron in a neural network has been found to correlate with its relative firing rate.[44]

Personalized PageRank is used by Twitter to present users with other accounts they may wish to follow.[45]

Swiftype's site search product builds a "PageRank that's specific to individual websites" by looking at each website's signals of importance and prioritizing content based on factors such as the number of links from the home page.[46]

A version of PageRank has recently been proposed as a replacement for the traditional Institute for Scientific Information (ISI) impact factor,[47] and implemented at Eigenfactor as well as at SCImago. Instead of merely counting total citations to a journal, the "importance" of each citation is determined in a PageRank fashion.

A similar new use of PageRank is to rank academic doctoral programs based on their records of placing their graduates in faculty positions. In PageRank terms, academic departments link to each other by hiring their faculty from each other (and from themselves).[48]

PageRank has been used to rank spaces or streets to predict how many people (pedestrians or vehicles) come to the individual spaces or streets.[49][50]

In lexical semantics it has been used to perform word sense disambiguation,[51] semantic similarity,[52] and also to automatically rank WordNet synsets according to how strongly they possess a given semantic property, such as positivity or negativity.[53]

A Web crawler may use PageRank as one of a number of importance metrics it uses to determine which URL to visit during a crawl of the web. One of the early working papers[54] that were used in the creation of Google is "Efficient crawling through URL ordering",[55] which discusses the use of a number of different importance metrics to determine how deeply, and how much of a site, Google will crawl.
PageRank is presented as one of a number of these importance metrics, though there are others listed, such as the number of inbound and outbound links for a URL and the distance from the root directory on a site to the URL.

The PageRank may also be used as a methodology to measure the apparent impact of a community like the blogosphere on the overall Web itself. This approach therefore uses PageRank to measure the distribution of attention in reflection of the scale-free network paradigm.[citation needed]

In any ecosystem, a modified version of PageRank may be used to determine species that are essential to the continuing health of the environment.[56] For the analysis of protein networks in biology, PageRank is also a useful tool.[57][58]

In 2005, in a pilot study in Pakistan, Structural Deep Democracy (SD2)[59][60] was used for leadership selection in a sustainable agriculture group called Contact Youth. SD2 uses PageRank for the processing of the transitive proxy votes, with the additional constraints of mandating at least two initial proxies per voter, and all voters are proxy candidates. More complex variants can be built on top of SD2, such as adding specialist proxies and direct votes for specific issues, but SD2, as the underlying umbrella system, mandates that generalist proxies should always be used.

PageRank has recently been used to quantify the scientific impact of researchers. The underlying citation and collaboration networks are used in conjunction with the PageRank algorithm to come up with a ranking system for individual publications which propagates to individual authors. The new index, known as the pagerank-index (Pi), is demonstrated to be fairer compared to the h-index in the context of the many drawbacks exhibited by the h-index.[61]

nofollow

In early 2005, Google implemented a new value, "nofollow",[62] for the rel attribute of HTML link and anchor elements, so that website developers and bloggers can make links that Google will not consider for the purposes of PageRank; they are links that no longer constitute a "vote" in the PageRank system. The nofollow relationship was added in an attempt to help combat spamdexing.

As an example, people could previously create many message-board posts with links to their website to artificially inflate their PageRank. With the nofollow value, message-board administrators can modify their code to automatically insert "rel='nofollow'" into all hyperlinks in posts, thus preventing PageRank from being affected by those particular posts. This method of avoidance, however, also has various drawbacks, such as reducing the link value of legitimate comments. (See: Spam in blogs#nofollow)

In an effort to manually control the flow of PageRank among pages within a website, many webmasters practice what is known as PageRank Sculpting,[63] which is the act of strategically placing the nofollow attribute on certain internal links of a website in order to funnel PageRank towards those pages the webmaster deems most important. This tactic has been used since the inception of the nofollow attribute, but may no longer be effective since Google announced that blocking PageRank transfer with nofollow does not redirect that PageRank to other links.[64]

Deprecation

PageRank was once available to verified site maintainers through the Google Webmaster Tools interface.
However, on October 15, 2009, a Google employee confirmed that the company had removed PageRank from its Webmaster Tools section, saying that "We've been telling people for a long time that they shouldn't focus on PageRank so much. Many site owners seem to think it's the most important metric for them to track, which is simply not true."[65] In addition, the PageRank indicator is not available in Google's own Chrome browser.

The visible page rank is updated very infrequently. It was last updated in November 2013. In October 2014 Matt Cutts announced that another visible PageRank update would not be coming.[66]

Even though "Toolbar" PageRank is less important for SEO purposes, the existence of back-links from more popular websites continues to push a webpage higher up in search rankings.[67]

Moz, an online marketing and SEO firm, offers a system similar to PageRank under the name "MozRank".

Google elaborated on the reasons for PageRank deprecation at Q&A #March and announced Links and Content as the top ranking factors. RankBrain was announced as the #3 ranking factor in October 2015, so the top 3 factors are now confirmed officially by Google.[68]

On April 15, 2016 Google officially shut down Google Toolbar PageRank data to the public. Google had declared its intention to remove the PageRank score from the Google Toolbar several months earlier.[69] Google will still use PageRank scores when determining how to rank content in search results.[70]

See also

Domain Authority
EigenTrust, a decentralized PageRank algorithm
Google bomb
Google Search
Google matrix
Google Panda
VisualRank, Google's application of PageRank to image search
Hilltop algorithm
Katz centrality, a 1953 scheme closely related to PageRank
Link love
Methods of website linking
Power method, the iterative eigenvector algorithm used to calculate PageRank
Search engine optimization
SimRank, a measure of object-to-object similarity based on the random-surfer model
Topic-Sensitive PageRank
TrustRank
Webgraph
CheiRank
Google Penguin
Google Hummingbird

References

Citations

[1] "Google Press Center: Fun Facts". Google. Archived from the original on 2001-07-15.
[2] "Facts about Google and Competition". Archived from the original on 4 November 2011. Retrieved 12 July 2014.
[3] Sullivan, Danny. "What Is Google PageRank? A Guide For Searchers & Webmasters". Search Engine Land. Archived from the original on 2016-07-03.
[4] Cutts, Matt. "Algorithms Rank Relevant Results Higher". Google. Archived from the original on July 2, 2013. Retrieved 19 October 2015.
[5] Brin, S.; Page, L. (1998). "The anatomy of a large-scale hypertextual Web search engine" (PDF). Computer Networks and ISDN Systems. 30: 107–117. doi:10.1016/S0169-7552(98)00110-X. ISSN 0169-7552. Archived (PDF) from the original on 2015-09-27.
[6] Gyöngyi, Zoltán; Berkhin, Pavel; Garcia-Molina, Hector; Pedersen, Jan (2006). "Link spam detection based on mass estimation". Proceedings of the 32nd International Conference on Very Large Data Bases (VLDB '06, Seoul, Korea) (PDF), pp. 439–450. Archived (PDF) from the original on 2014-12-03.
[7] Pinski, Gabriel; Narin, Francis. "Citation influence for journal aggregates of scientific publications: Theory, with application to the literature of physics". Information Processing & Management. 12 (5): 297–312. doi:10.1016/0306-4573(76)90048-0.
[8] Saaty, Thomas (1977). "A scaling method for priorities in hierarchical structures". Journal of Mathematical Psychology. 15 (3): 234–281. doi:10.1016/0022-2496(77)90033-5.
[9] Love, Bradley C.; Sloman, Steven A. "Mutability and the determinants of conceptual transformability" (PDF). Proceedings of the Seventeenth Annual Conference of the Cognitive Science Society: 654–659.
[10] "How a CogSci undergrad invented PageRank three years before Google". Bradley C. Love. Archived from the original on 2017-12-11. Retrieved 2017-12-23.
[11] Page, Larry. "PageRank: Bringing Order to the Web". Stanford Digital Library Project, talk, August 18, 1997. Archived from the original on May 6, 2002. Retrieved 2016-09-11.
[12] 187-page study from Graz University, Austria. Archived 2014-01-16 at the Wayback Machine. Includes the note that human brains are also used when determining the page rank in Google.

View Our Customer Reviews

Impressive response time. I had multiple occasions to interact with the support team (each time a different employee), without any frustration. PS: the problems I had related mainly to small issues (which should still be addressed). The products themselves are great (price/value); they just fall short on the full experience.

Justin Miller