A Guide to Completing the Assumption Of Risk And General Release Form - Harvard University Online
If you want to edit and complete an Assumption Of Risk And General Release Form - Harvard University, here are the steps you need to follow:
- Hit the "Get Form" button on this page.
- Wait patiently for your Assumption Of Risk And General Release Form - Harvard University to upload.
- Erase, add text, sign, or highlight as you choose.
- Click "Download" to save your changes.
A Revolutionary Tool to Edit and Create Assumption Of Risk And General Release Form - Harvard University


Edit or Convert Your Assumption Of Risk And General Release Form - Harvard University in Minutes
How to Easily Edit Assumption Of Risk And General Release Form - Harvard University Online
CocoDoc makes it easy to customize important documents in an online browser and fill them in however you want. To edit a PDF document on the online platform, follow these steps:
- Open the official CocoDoc website in your device's browser.
- Hit the "Edit PDF Online" button and choose the PDF file from your device; no account login is required.
- Edit the PDF online using the toolbar.
- Once done, save the document from the platform.
Once the document has been edited on the website, you can export it however you choose. CocoDoc provides a secure, smooth environment for working with PDF documents.
How to Edit and Download Assumption Of Risk And General Release Form - Harvard University on Windows
Windows users are common throughout the world, and many applications offer them PDF-editing services. However, these applications have often lacked important features. CocoDoc aims to give Windows users a complete document-editing experience through its online interface.
Editing a PDF document with CocoDoc is very simple. Follow these steps:
- Find and install CocoDoc from the Windows Store.
- Open the software, select the PDF file from your Windows device, and proceed to edit the document.
- Customize the PDF file with the toolkit that appears in CocoDoc.
- On completion, hit "Download" to save your changes.
A Guide of Editing Assumption Of Risk And General Release Form - Harvard University on Mac
CocoDoc offers an impressive solution for Mac owners, letting them edit their documents quickly. Mac users can make a PDF fillable online for free using the online platform provided by CocoDoc.
To edit a form with CocoDoc, follow these steps:
- First, install CocoDoc on your Mac.
- Once the tool is open, upload your PDF file from the Mac.
- Drag and drop the file, or click the "Choose File" button to select it, and start editing.
- Save the file on your device.
Mac users can export their finished files in various ways: downloading, adding to cloud storage, or sharing via email. CocoDoc lets them edit files in multiple ways without downloading any tool to their device.
A Guide of Editing Assumption Of Risk And General Release Form - Harvard University on G Suite
Google Workspace is a powerful platform that connects the people of a workplace in a unique manner. It lets users share files across the platform while covering all the major tasks that can be carried out in a physical workplace.
Follow these steps to edit Assumption Of Risk And General Release Form - Harvard University on G Suite:
- Go to the Google Workspace Marketplace and install the CocoDoc add-on.
- Select the file in Google Drive and click "Open with".
- Edit the document in the CocoDoc PDF editing window.
- When the file is fully edited, share it through the platform.
PDF Editor FAQ
Would a nuclear war truly end the world, or is it just fear mongering?
It is just fearmongering; the effects are drastically overstated. The world is no longer armed with anything near enough to cause even the fall of a major nation. One funny observation I have made is that most people don't want to believe the truth. They are happier believing in the apocalypse, practically defending the end of the world like it is some earned right. Don't let yourself be brainwashed by Hollywood and extreme political rhetoric. It isn't hard: think for yourself.

The following is from another answer of mine. The footnotes will take you there for references.

Nuclear war in the 21st century would not be the end of Russia or the USA, and certainly not of mankind

Many of us can remember a time when the threat of MAD, or Mutually Assured Destruction, was very real. A time when you felt helpless to change the events that constantly seemed to be unraveling half a world away. Everyone's fate hung endlessly by a thread, and at any moment, by mistake or intent, we could all be dead in 30 minutes… or less.

In the 1970s and 1980s the world's nuclear arsenals grew to an unfathomable size, both in quantity and in destructive force. There was no doubt that the world was facing Armageddon: an end to all of mankind, and a post-apocalyptic vision of hell on Earth for those unlucky enough to survive. We had boxed ourselves into a corner where global war was unthinkable.

In 2017, a lot has changed.

The destructive force of all the world's nuclear weapons is a fraction of what it once was. Surprisingly quietly, the USA and Russia have dismantled over 50,000 nuclear weapons over the past 30 years. The nuclear material from these bombs and other stockpiles of weapons-grade material was recycled and used in nuclear power generation over the past 20 years. [1] A fact few may be aware of: this actually crashed the uranium market in the early 2000s.
The glut of available fuel brought the open-market trading value down from $20 a pound to near $2 a pound at the time. So a lot has changed since the days when many of us lived under the very real threat of mutually assured destruction.

Reductions in the US Arsenal

Since the end of the Cold War, the United States has eliminated entire classes of nuclear weapons: for example, the Army's nuclear artillery and tactical missiles, and the Navy's tactical nuclear weapons on surface ships.

After a dramatic build-up to more than 32,000 warheads by 1966, the trend since then has been, with a few bumps and plateaus, consistently downward. While the numbers declined by only one quarter over the next twenty years, the types of warheads in the stockpile changed dramatically, with strategic warheads increasing and tactical warheads decreasing. Of the 32,000 warheads in 1967, approximately one-third were strategic and the balance tactical. Of the 23,500 warheads in 1987, almost two-thirds were strategic and the balance tactical. Between 1987 and 1996, more than 13,300 weapons were retired, leaving approximately 10,500 warheads in the stockpile. President George H.W. Bush cut the stockpile in half to the 10,000-11,000 level through treaty agreements and unilateral actions as the Cold War ended. His son, President George W. Bush, cut it in half again in the 2002-2008 period. [2]

Multi-Megaton Weapons Now Obsolete

What has changed, that the world no longer builds megaton weapons? The need for multi-megaton weapons was the result of low accuracy of warhead delivery on target. We needed a sledgehammer approach to take out hardened targets, and the way that was done was through very high-yield bombs, typically >=5 MT.
The average nuclear weapon today, in 2017, is about 443 KT at full yield, but a large portion of those bombs can be adjusted in the field to a very small fraction of their potential yield.

Today the accuracy of on-target delivery has improved significantly: we hit what we aim for. Making a weapon twice as accurate has the same effect on lethality as making the warhead eight times as powerful. Phrased another way, making the missile twice as precise would require only one-eighth the explosive power to maintain the same lethality. [3] This means we need less hammer to do the same job.

In the 1980s the development of earth-penetrating rounds was another game changer. Not only were we on target, but now we could penetrate hundreds of feet of earth and concrete before detonating the warhead. This allowed a 100 KT weapon to do the damage of a >1 MT surface detonation. This is now the primary method for attacking hardened targets, and it is the final driver toward smaller-yield bombs. (Note that while conventional warheads can penetrate hundreds of feet of concrete, nuclear warheads are usually limited to less than 30 ft in concrete, because the complexity of the warhead keeps it from surviving deeper penetration. Soil penetration depth varies with composition, but full ground coupling of the energy requires only 4 or 5 meters.)

The net effect of using EPWs (Earth Penetrating Weapons) is a reduction in casualties compared with a surface burst. This is primarily due to the 96% reduction in weapon yield that an EPW makes possible: the greater coupling of the released energy to the ground shock for a buried detonation is equivalent to a surface burst with 25 times the explosive energy.
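The accuracy-versus-yield tradeoff quoted above can be sanity-checked with the standard scaling assumption that lethality against a point target goes as yield^(2/3) divided by the square of the CEP (circular error probable). The specific warhead and CEP numbers below are illustrative, not from the answer's sources:

```python
# Lethality scaling sketch: blast radius grows as the cube root of yield,
# so lethal area goes as yield**(2/3); accuracy enters as 1/CEP**2.
def lethality(yield_kt: float, cep_m: float) -> float:
    return yield_kt ** (2 / 3) / cep_m ** 2

base = lethality(1000, 200)              # 1 MT warhead, 200 m CEP (illustrative)
twice_as_accurate = lethality(1000, 100)  # halve the CEP
eight_times_yield = lethality(8000, 200)  # or multiply the yield by 8

print(twice_as_accurate / base)    # ~4: halving CEP quadruples lethality
print(eight_times_yield / base)    # ~4: the same gain as 8x the yield
print(lethality(125, 100) / base)  # ~1: 1/8 the yield at 2x accuracy breaks even
```

This is why the text says twice the accuracy is worth eight times the yield: the two changes produce the same factor-of-four gain in lethality, so an eighth of the yield at double the accuracy holds lethality constant.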
For rural targets, the use of a nuclear earth-penetrator weapon is estimated to reduce casualties by a factor of 10 to 100 relative to a nuclear surface burst of equivalent probability of damage. [4]

To exploit that efficiency, in 1997 the US replaced its aging 9-megaton bombs with a lower-yield but earth-penetrating 300-kt model, created by putting the nuclear warhead from an earlier bomb design into a strengthened alloy-steel casing and a new nose cone. [5]

Pictured above is the first US precision-guided EPW tactical nuclear gravity bomb. The B61-12 is designed to have four selectable explosive yields: 0.3 kilotons (kt), 1.5 kt, 10 kt and 50 kt. The B61-12 will be integrated on virtually all nuclear-capable U.S. and NATO aircraft: B-2, LRS-B (next-generation long-range bomber), F-35A, F-16, F-15E, and PA-200 Tornado. [6]

To fully appreciate this evolution, consider a targeting scenario as it was in the 1970s compared to 2020. In the 1970s, destroying a hardened silo rated at 1000 psi required multi-megaton bombs. Since weapons accuracy was no better than 200-300 yards, you needed surface-burst weapons greater than 1 megaton to take out the target. By the early 2000s, that same target could be destroyed with a 70 kt EPW with 100-yard accuracy. Fast-forward to 2020, and that target can be taken out with a 1 kt EPW with an accuracy of 10 meters. It isn't the same nuclear war whose familiar shadow we grew up under. It is completely different. [7]

A Common Story: "There are enough nuclear weapons to destroy the world many times over."

This is nothing more than poorly crafted fiction, an urban legend. This common conclusion is not based on any factual data.
It is based solely on hype, hysteria, propaganda and fear mongering.

Take every weapon in existence today, approximately 6,500 megatons across 15,000 warheads with an average yield of 433 KT, [11] and put each bomb in its own 100-square-mile grid cell (10 miles x 10 miles), one bomb per cell. On average, each cell will contain >95% of the destructive force of its bomb. [12] This means the total landmass receiving destructive force from all the world's nuclear bombs is an area of 1.5 million square miles: not quite half of the United States, and 1/38 of the world's total land mass. That's it!

In truth it would be far less. A higher concentration of detonations would take place over military targets, likely 10-30 times greater in concentration over those areas. [13] And if they were used in war, it is unlikely more than 40% would be used, even in a total-war situation. So the actual area of intense destruction in a nuclear war is somewhere between 150,000 and 300,000 square miles, or 1/384 to 1/192 of the world's land mass.

These numbers are easily verifiable, and they are right. So many have bought into the endless rhetoric of world-shattering destructiveness and inevitable end-of-civilization scenarios that they can no longer be objective or analytical; they have put their beliefs in front of rational thinking. I find this true even with most scientists. I challenge anyone to just do the math. It is easy.

You win wars by taking out the opposing team's ability to make war, not their population centers. The arsenals of today are just enough to cover military objectives. There would be no wholesale war against civilians. That is just more fear mongering and Hollywood storytelling.

Urban Legend: Nuclear Weapons Vaporize Everything in the Fireball

Much of the actual structure that held the bombs during above-ground testing remained intact after detonation. The blast heat is intense but brief.
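Taking up the "just do the math" challenge above, the grid arithmetic can be reproduced in a few lines. The warhead count and destruction-area figures are the answer's own; the 57.5 million square miles of world land area is a standard approximation:

```python
# Check the claim: ~15,000 warheads, each assigned its own 10 mi x 10 mi cell
# that contains >95% of that bomb's destructive effects.
warheads = 15_000
cell_sq_mi = 10 * 10

total_sq_mi = warheads * cell_sq_mi
world_land_sq_mi = 57_500_000  # approximate land area of Earth

print(total_sq_mi)                     # 1,500,000 sq mi
print(world_land_sq_mi / total_sq_mi)  # ~38 -> about 1/38 of world land mass

# With strikes concentrated over military targets and <=40% of arsenals used,
# the answer's 150,000-300,000 sq mi range of intense destruction gives:
print(world_land_sq_mi / 300_000)      # ~192 -> roughly 1/192
print(world_land_sq_mi / 150_000)      # ~383 -> roughly 1/384
```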
There is not enough thermal energy to vaporize large objects, even near the hottest point with the greatest pressure of a detonation.

"Observations of the remains of towers and shielding material after detonation at several ground zeros indicate that large masses of material are not vaporized. Observations of the residue of the Smoky tower [44 kt bomb atop a 700 foot high steel tower] indicated that a very significant portion of that tower remained, including the upper 200 feet of steel. Another example similar to Shot Smoky was Shot Apple II [29 kt atop a 500 ft steel tower], Teapot Series. Even though the total yield of Shot Apple II was about [29 kt], the floor of the cab [housing the nuclear bomb itself, at the top of the tower] and the main tower support columns remained intact. The results of the Shot Fizeau [11 kt atop a 500 ft steel tower] tower melt studies (W. K. Dolen and A. D. Thornborough, Fizeau Tower Melt Studies, Sandia report SC-4185, 1958, Secret) show that about 85 percent of tower material was accounted for after the detonation and that only the upper 50 feet of tower was vaporized. No melting occurred beyond 175 feet from the top of the tower although the fireball theoretically engulfed more than 400 feet of the tower."

Dr. Kermit H. Larson, et al., Distribution, Characteristics, and Biotic Availability of Fallout, Operation Plumbbob, weapon test report WT-1488, ADA077509, July 1966, page 59 [14]

The Evolution of Military Doctrine: Minimize Civilian Casualties, Don't Take Out the Cities

The Law of Armed Conflict (LOAC) is an extension of the part of customary international law regulating the conduct of armed hostilities.
When considering the utility of nuclear weapons, two LOAC principles are most germane: the principle of military necessity, and the principle of lawful targeting.

The principle of military necessity calls for using only the degree and kind of force required for the partial or complete submission of the enemy, while taking into consideration the minimum expenditure of time, life, and physical resources. This principle is designed to limit the application of force to that required for carrying out lawful military purposes. Although the principle of military necessity recognizes that some collateral damage and incidental injury to civilians may occur when a legitimate military target is attacked, it does not excuse the wanton destruction of lives and property disproportionate to the military advantage to be gained. For any weapon employed, the force used should not cause more destruction than necessary to achieve military objectives. Consequently, a conventional weapon may be all that is needed, or a smaller-yield nuclear weapon may be preferred over a larger-yield warhead, if the military objectives can still be achieved.

The principle of lawful targeting requires that all reasonable precautions be taken to ensure the targeting of only military objectives, so that damage to civilian objects (collateral damage) and death or injury to civilians (incidental injury) are avoided as much as possible. Deliberately striking population centers instead is what is grouped under the term "countervalue targeting." [15]

The view in 2016 from military law attorneys is that countervalue targeting is illegal under the Law of Armed Conflict (LOAC). That was not always the case. In the late 1940s, the U.S. did not have a declaratory nuclear doctrine. In the event of war, military leaders assumed that the few bombs in the nuclear inventory would be used against a small number of enemy cities, as they were at Hiroshima and Nagasaki.
In 1948 the Joint Chiefs of Staff (JCS) expanded the Hiroshima concept into a war plan for a single strategic air strike against major Soviet cities. It was argued that this would deter Moscow from starting a war for fear of the terrible destruction that American reprisals would inflict on the USSR.

Today, US policy is not to target civilians at all. The document known as JSCP-N (formerly Annex C) provides nuclear planning guidance to combatant commanders in accordance with the Policy Guidance for the Employment of Nuclear Weapons (NUWEP) issued by the Secretary of Defense. The details of the targeting guidance are available to the public. [16]

In 1949 the Soviet Union exploded its first nuclear weapon. The emerging nuclear arsenal of the USSR raised an overriding new requirement for U.S. doctrine. Although the JCS continued to plan for an attack against Soviet cities, destroying enemy nuclear weapons became the priority for American nuclear forces, and remains so to this day. At the same time, U.S. leaders seriously debated whether to wage a preventive war in order to destroy Soviet nuclear forces before they could be used. In 1950, President Truman rejected preventive war as inconsistent with American values.

During the Kennedy administration, Secretary of Defense McNamara developed plans that limited U.S. nuclear attacks to only one or two of the three traditional categories of targets: nuclear forces, other military, and urban-industrial. Under the revised declaratory doctrine, known as the "no cities" or "city hostage" doctrine, U.S. forces would, in the event of Soviet aggression, first strike military targets (categories one and two) while simultaneously threatening to hit cities (category three) next, in order to deter Moscow from retaliating against American population centers. The "no-cities" doctrine represented a shift away from massive retaliation and toward a more calibrated response to Soviet aggression.
Indeed, this increased targeting flexibility was adopted by NATO in 1967 when it formally approved the declaratory doctrine of flexible response, which remains in force today. [17]

During the early 1960s, deterrence was discussed in countervalue terms. For example, Jerome Wiesner, science adviser to President John F. Kennedy and President Lyndon B. Johnson, testified before Congress that the U.S. could establish deterrence based on a threat to destroy six of the 10 largest Soviet cities. However, by the mid-1980s, U.S. officials began to publicly explain that the U.S. did not target civilian populations and instead targeted Soviet military assets, including nuclear forces. [18]

The committee notes that although some scenarios show substantial nuclear-radiation-induced fatalities, military operational guidance is to attack targets in ways that minimize collateral effects. Calculated numbers of fatalities expected from an attack on an HDBT (hard and deeply buried target) might be reduced by operational planning and employment tactics. Assuming that other strategic considerations permit, the operational commander could warn of a nuclear attack on an HDBT, or could time such an attack to take advantage of wind conditions that would reduce expected casualties from acute and latent effects of fallout by factors of up to 100, assuming the wind conditions were known well enough and were stable, and that defenses against the attack could not be mobilized. However, a nuclear weapon burst in a densely populated urban environment will always result in a large number of casualties. [19]

After the Korean War, the U.S. Army's revised field manual on the law of land warfare introduced a new statement that expressed as doctrine the growing importance of intention.
The revised 1956 manual said, "It is a generally recognized rule of international law that civilians must not be made the object of attack directed exclusively against them." Previous Army manuals had left this rule unexpressed. As a subculture, military professionals may have placed even more emphasis on their intention not to harm noncombatants, even in the face of widespread civilian deaths. While the sources make it difficult to assess the personal sentiments of officers and soldiers about civilian casualties during the Korean War, it is not hard to believe that many privately did not want to think of themselves as waging war against defenseless civilians. [20]

Survival: Fallout Is a Short-Lived Problem in Most Places

Using the 7/10 rule of radionuclide decay (for every sevenfold increase in time, the dose rate falls tenfold), the radiation after about 100 days will be 1/10,000 of the level it was an hour after the bombs went off, and within about two years it will have dropped below 1/100,000 of that initial level. The majority of bombs would be airburst, which creates little to no fallout, significantly reducing these dangers. [27]

Surface bursts are tactically undesirable; instead there would be earth-penetrating warheads. We no longer stock multi-megaton weapons because, first, our delivery accuracy no longer needs a sledgehammer approach: we hit what we aim for within feet and yards. Second, the development of ground-penetrating warheads vastly reduced the energy required to take out hardened targets. Ground penetrators, while messy, are not as bad a fallout generator as a surface burst, since there is little neutron activation of secondary materials, a major contributor to surface-burst fallout. [28][29]

7/10 Rule chart above [30]

Where are you safest from fallout?

A regular cellar is not much better than being outside. A good fallout shelter has a protection factor of 1000, meaning it reduces your exposure to the fallout outside by a factor of 1000.
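A minimal sketch of how the 7/10 rule combines with a shelter's protection factor, using the standard t^-1.2 decay approximation for mixed fission products. The H+1 outside dose rate here is a made-up illustrative figure:

```python
# 7/10 rule: the fallout dose rate drops ~10x for every 7x increase in time,
# which corresponds to the t**-1.2 decay law for mixed fission products.
def dose_rate(r1: float, hours: float) -> float:
    """Dose rate `hours` after detonation, given rate r1 at H+1 hour."""
    return r1 * hours ** -1.2

R1 = 1000.0  # hypothetical outside dose rate at H+1, in rads/hour

for hours, label in [(7, "7 hours"), (49, "49 hours"),
                     (343, "2 weeks"), (2401, "100 days")]:
    outside = dose_rate(R1, hours)
    print(f"{label}: outside {outside:.2f}, PF-10 basement {outside / 10:.3f}, "
          f"PF-1000 shelter {outside / 1000:.5f} rads/hr")
# Each 7x step in time drops the outside rate to roughly 1/10 of the previous value;
# the shelter's protection factor then divides whatever is left outside.
```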
A typical basement is rated at only 10, which means you are dead if you are in the path of major fallout.

Places rated at 1000 or higher:
- a sub-basement (a basement under a basement); you need at least 6 feet of dirt over your head to protect you from all forms of radiation.
- the second level below street level of a concrete-reinforced parking garage (which can also be closed off at the entrance).
- the inner windowless rooms on the 4th floor or higher of a high-rise building (always leave at least 2 floors between you and the roof).

According to FEMA, these are your best bets. Whatever gives you the greatest distance from the source of the radiation is your best option. If none of these examples is available, just apply that distance guideline and some common sense. [31][32] Plan on being there at least 2 weeks, and perhaps a month.

A 2017 report by the Radiation Effects Research Foundation (RERF), a binational research organization funded by the governments of the United States and Japan, investigates the health effects of atomic bomb radiation among A-bomb survivors in Hiroshima and Nagasaki. The findings from the study of the atomic bomb survivors show that the actual biological risk from nuclear radiation is surprisingly smaller than most people realize. The lifetime cancer death rate among those survivors went up less than one percent, and no biological effects at all have been detected among those who received lower doses (below 110 millisieverts). No multi-generational genetic damage has been detected either. [33]

Nuclear Winter

Politically motivated bad science. For more detail, see the answer the footnotes take you to. In general:

Russell Seitz, Associate of the Harvard University Center for International Affairs, argues that the winter model's assumptions give the results the researchers want to achieve, a case of "worst-case analysis run amok."
Seitz criticized the theory for being built on a chain of successive worst-case assumptions. [92]

The most recent and most complete analysis done to date indicates that the risk isn't real: Climate Impact of a Regional Nuclear Weapons Exchange: An Improved Assessment Based On Detailed Source Calculations. Fundamental assumptions and model parameters used in the past studies were wrong, some of them deliberately so, in order to get the desired results. A brief summary:

"Our comprehensive urban fire simulations indicate that the bulk of the carbon mass remains in the troposphere, where it is quickly removed from the atmosphere. In most previous work, for example, that of Stenke et al. (2013) and Mills et al. (2014), all of the soot produced by the urban fires is directly injected near the top of the troposphere, and therefore much of it rises into the stratosphere, where it shades and cools the Earth. In contrast, if we use a realistic vertical profile for the BC aerosols as input to the climate model, the long-term global impacts on climate are much less severe than predicted by previous studies. This was true even with conservative, worst case assumptions regarding BC production." [103]

The truth is out there.
How many past long term predictions about global warming/climate change are true versus false?
Most know that the wild predictions of the alarmists, particularly the huckster claims of sea-level rise by the now-infamous Al Gore, are false. The truth is that none of the claimed predictions have happened or will happen. Andy May has put together an excellent analysis detailing the failed climate predictions. I have added data to his reality checks showing the failures, outlined in yellow.

Some Failed Climate Predictions

Andy May / October 30, 2017

By Javier

Here, for the first time in public, is Javier's entire collection of massive "consensus" climate science prediction failures. This collection is carefully selected from academics or high-ranking officials only, as reported in the press or scientific journals. Rather than being exhaustive, this is a list of fully referenced arguments showing that consensus climate science usually gets things wrong, and thus that its predictions cannot be trusted.

To qualify for this list, the prediction must have failed. Alternatively, it is also considered a failure when so much of the allowed time has passed that a drastic and improbable change in the rate of change would be required for it to come true. We also include a prediction when observations are going in the opposite direction. Finally, a prediction also qualifies when one thing and its opposite are both predicted.

A novelty is that I also add a part B that includes obvious predictions that consensus climate science did not make. In science you are also wrong if you fail to predict the obvious.

A. Failed predictions

1. Warming rate predictions

1990 IPCC FAR: "Under the IPCC 'Business as Usual' emissions of greenhouse gases the average rate of increase of global mean temperature during the next century is estimated to be 0.3°C per decade (with an uncertainty range of 0.2°C - 0.5°C)." See here, page xi.

Reality check: Since 1990 the warming rate has been 0.12 to 0.19°C per decade, depending on the database used, outside the 1990 uncertainty range.
CO2 emissions have tracked the "Business as Usual" scenario. An interesting discussion of the 1990 FAR report warming predictions, with an analysis of them through April 2015, can be seen here. A list of official warming rates from various datasets and for various time spans can be seen here.

2. Temperature predictions

1990 IPCC FAR: "Under the IPCC 'Business as Usual' emissions of greenhouse gases … this will result in a likely increase in global mean temperature of about 1°C above the present value by 2025." See here, page xi.

Reality check: From 1990 to 2017 (first 8 months) the increase in temperatures has been 0.31 to 0.49°C, depending on the database used. CO2 emissions have tracked the Business as Usual scenario.

Figure 1. CMIP5 climate models developed by 2010 still predict more warming than observed, only a few years later. Source here.

3. Winter predictions

2001 IPCC TAR (AR3) predicts that milder winter temperatures will decrease heavy snowstorms. See here.

2014 Dr. John Holdren, director of the Office of Science and Technology Policy for the Obama administration, said: "a growing body of evidence suggests that the kind of extreme cold being experienced by much of the United States as we speak is a pattern we can expect to see with increasing frequency, as global warming continues." See here.

Reality check: By predicting both milder winters and colder winters, the probability of getting it right increases. Now, to cover all possibilities, they simply need to predict no change in winters.

Germany: coldest September morning since weather records began! (September 26, 2018, by Robert)

4. Snow predictions

2000 Dr. David Viner, a senior research scientist at the climatic research unit (CRU) of the University of East Anglia, predicts that within a few years winter snowfall will become "a very rare and exciting event": "Children just aren't going to know what snow is." See here.

2001 IPCC TAR (AR3) predicts that milder winter temperatures will decrease heavy snowstorms.
See here.

2004 Adam Watson, from the Centre for Ecology and Hydrology in Banchory, Aberdeenshire, said the Scottish skiing industry had no more than 20 years left. See here.

Reality check: 2014 had the snowiest Scottish mountains in 69 years. One ski resort's problem was having some of its lifts buried in snow. See here.

"Most snow in hills in 69 years, says Hamish MacInnes" (Steven McKenzie, BBC Scotland Highlands and Islands reporter, 26 February 2014; image caption: a climber at the top of North Gully on An Teallach): World-renowned climber Hamish MacInnes believes this winter in Scotland's mountains is the snowiest since 1945. He said he had not seen such "colossal volumes" of snow since he started climbing as a youngster 69 years ago. Dumfries and Galloway-born Mr MacInnes is also the inventor of mountaineering equipment, including a stretcher. He said: "The first time I went climbing was in 1945 and I remember cutting our way through snow in Glencoe. I've not seen anything like it until now. This covering of snow we have just now is very alpine. There is a very defined demarcation line between where the snow starts and the bare grass below. The volume of snow is colossal. It has been falling for weeks now."

Reality check: Northern Hemisphere snow area shows remarkably little change since 1967. See here. The 2012-2013 winter was the fourth-largest winter snow cover extent on record for the Northern Hemisphere. See here.

5. Precipitation predictions

2007 IPCC AR4 predicts that by 2020, between 75 and 250 million people are projected to be exposed to increased water stress due to climate change, and that in some countries yields from rain-fed agriculture could be reduced by up to 50%.
See here.

Reality check: Only six years later, the IPCC acknowledged that confidence is low for a global-scale observed trend in drought or dryness (lack of rainfall) since the middle of the 20th century, and that AR4 conclusions regarding global increasing trends in drought since the 1970s were probably overstated. See here, page 162.

6. Extreme weather predictions

2010 Dr. Morris Bender of NOAA and coauthors predict that "the U.S. Southeast and the Bahamas will be pounded by more very intense hurricanes in the coming decades due to global warming." They say the strongest hurricanes may double in frequency. See here.

Reality check: After 40 years of global warming, no increase in hurricanes has been detected. The NOAA U.S. Landfalling Tropical System index shows no increase; in fact, a very unusual 11-year drought in strong hurricane US landfalls took place from 2005-2016. See NOAA statistics here.

Meteorologists warned the Carolinas that Hurricane Florence might be the "storm of the century," inundating the southeast coastline with catastrophic wind and "biblical" flooding. But one Weather Channel meteorologist was caught on Friday exaggerating the storm's power. In a viral video that has now been viewed millions of times, Weather Channel meteorologist Mike Seidel is seen leaning into the wind and rain, appearing to struggle to maintain his balance in the brutal weather. Then Seidel's cameraman pans out and captures two men casually strolling in the background.

North Atlantic: In August, Hurricane Harvey ended a record-long major hurricane (Category 3 or stronger) landfall drought in the United States. Prior to Harvey, the last major hurricane to make landfall in the U.S. was Wilma on October 24, 2005. This major hurricane drought surpassed the eight years from 1861-1868 when no major hurricane struck the United States' coast. On average, a major hurricane makes landfall in the U.S. about once every three years.
The reliable record of landfalling hurricanes in the U.S. dates back to 1851.

[Image: https://www.ncdc.noaa.gov/monitoring-content/sotc/tropical-cyclones/2017/08/Harvey_sat.png]

IPCC AR5 (see here) states: “Current datasets indicate no significant observed trends in global tropical cyclone frequency over the past century … No robust trends in annual numbers of tropical storms, hurricanes and major hurricanes counts have been identified over the past 100 years in the North Atlantic basin.”

“In summary, there continues to be a lack of evidence and thus low confidence regarding the sign of trend in the magnitude and/or frequency of floods on a global scale.”

“In summary, there is low confidence in observed trends in small-scale severe weather phenomena such as hail and thunderstorms.”

7. Wildfire predictions

2001 IPCC TAR (AR3) said that fire frequency is expected to increase with human-induced climate change, and that several authors suggest climate change is likely to increase the number of days with severe burning conditions, prolong the fire season, and increase lightning activity, all of which lead to probable increases in fire frequency and areas burned. See here.

2012 Steve Running, a wildfire expert, ecologist and forestry professor at the University of Montana, says the fires burning throughout the U.S. offer a window into what we can expect in the future as the climate heats up. See here.

Reality check: The global area of land burned each year declined by 24 percent between 1998 and 2015, according to analysis of satellite data by NASA scientists and their colleagues. Scientists now believe the decrease in forest fires is increasing the amount of CO2 stored by plants by 7%. See here.
The largest decline was seen across savannas in Africa, owing to changing livelihoods. (Credit: Joshua Stevens/NASA’s Earth Observatory)

8. Rotation of the Earth predictions

2007 Dr. Felix Landerer of the Max Planck Institute for Meteorology in Hamburg, Germany, published a study predicting that global warming will make the Earth spin faster. See here.

2015 Dr. Jerry Mitrovica, professor of geophysics at Harvard University, finds that days are getting longer as the Earth spins slower, and blames climate change. See here.

Reality check: Doing one thing and its opposite simultaneously has always been possible for climate change. However, the International Earth Rotation and Reference Systems Service (IERS) informs us that the Earth slowed down from the start of measurements in 1962 to 1972, and sped up between 1972 and 2005. Since 2006 it has been slowing down again. It shows the same inconsistency as global warming. See here.

9. Arctic sea ice predictions

2007 Prof. Wieslaw Maslowski, from the Department of Oceanography of the US Navy, predicted an ice-free Arctic Ocean in summer 2013, and said the prediction was conservative. See here.

2007 NASA climate scientist Jay Zwally predicted that the Arctic Ocean could be nearly ice-free at the end of summer in 2012. See here.

2008 University of Manitoba Prof. David Barber predicted an ice-free North Pole for the first time in history in 2008. See here.

2010 Mark Serreze, director of the NSIDC, predicts the Arctic will be ice-free in the summer by 2030. See here.

2012 Prof. Peter Wadhams, head of the polar ocean physics group at the University of Cambridge (UK), predicted a collapse of the Arctic ice sheet by 2015-2016. See here.

Reality check: No decrease in September Arctic sea ice extent has been observed since 2007. See here and here.

Evidence that multidecadal Arctic sea ice has turned the corner
Guest Blogger / October 7, 2016

10. Polar bear predictions

2005 The 40 members of the Polar Bear Specialist Group (PBSG) of the World Conservation Union decided to classify the polar bear as “vulnerable” based on a predicted 30 percent decline in their worldwide population over the next 35 to 50 years. The principal cause of this decline is stated to be climatic warming and its negative effects on the sea ice habitat. See here.

2017 The US Fish and Wildlife Service releases a report concluding that human-driven global warming is the biggest threat to polar bears, and that if action isn’t taken soon the Arctic bears could be at serious risk of extinction: “It cannot be overstated that the single most important action for the recovery of polar bears is to significantly reduce the present levels of global greenhouse gas emissions.” See here.

2010 Science: a fake polar bear picture was chosen to illustrate a letter to Science about scientific integrity on climate change. You just can’t make this stuff up. See here and here.

Figure 2. The fake picture (left) published in Science, May 2010.

Reality check: Average September Arctic sea ice extent for the 1996-2005 period was 6.46 million km2. It declined by 26% to 4.77 million km2 for the 2007-2016 period. Despite the sea ice decline, the polar bear population increased from an estimated 20,000-25,000 in 2005 to an estimated 22,000-31,000 in 2015. See here.

11. Glacier predictions

2007 IPCC AR4 says there is a very high likelihood that Himalayan glaciers will disappear by the year 2035, and perhaps sooner if the Earth keeps warming at the current rate. See here.

IPCC officials recanted the prediction in 2010 after it was revealed the source was not peer-reviewed. Previously they had criticized the Indian scientist who questioned the prediction and ignored an IPCC author who warned in 2006 that the prediction was wrong. See here.

12. Sea level predictions

1981 James Hansen, NASA scientist, predicted a global warming of “almost unprecedented magnitude” in the next century that might even be sufficient to melt and dislodge the ice cover of West Antarctica, eventually leading to a worldwide rise of 15 to 20 feet in sea level. See here.

Reality check: Since 1993 (24 years) sea level has risen a total of 72 mm (3 inches), instead of the roughly 4 feet that corresponds to one-fourth of a century of the prediction. The alarming prediction is more than 94% wrong, so far. See here.

A NASA study, published in the Journal of Glaciology in 2015, claims that Antarctic ice mass is increasing. See here. Antarctic sea ice reached a record extent in 2014. See here.

13. Sinking nations predictions

1989 Noel Brown, director of the New York office of the U.N. Environment Program (UNEP), says entire nations could be wiped off the face of the Earth by rising sea levels if the global warming trend is not reversed by the year 2000. As global warming melts polar icecaps, ocean levels will rise by up to three feet, enough to cover the Maldives and other flat island nations. See here.

Reality check: Tide gauges referenced by GPS at 12 locations in the South Pacific reported variable trends between -1 and +3 mm/year for the 1992-2010 period. See here. The Diego Garcia atoll in the Indian Ocean experienced a land area decrease of only 0.92% between 1963 and 2013. See here. The Funafuti atoll has experienced a 7.3% net island area increase between 1897 and 2013. See here.

RESEARCH ARTICLE, June 1, 2015
Coral islands defy sea-level rise over the past century: Records from a central Pacific atoll
P.S. Kench, D. Thompson, M.R. Ford, H. Ogawa, R.F. McLean
Geology (2015) 43 (6): 515-518. https://doi.org/10.1130/G36555.1

Abstract: The geological stability and existence of low-lying atoll nations is threatened by sea-level rise and climate change.
Funafuti Atoll, in the tropical Pacific Ocean, has experienced some of the highest rates of sea-level rise (∼5.1 ± 0.7 mm/yr), totaling ∼0.30 ± 0.04 m over the past 60 yr. We analyzed six time slices of shoreline position over the past 118 yr at 29 islands of Funafuti Atoll to determine their physical response to recent sea-level rise. Despite the magnitude of this rise, no islands have been lost, the majority have enlarged, and there has been a 7.3% increase in net island area over the past century (A.D. 1897–2013). There is no evidence of heightened erosion over the past half-century as sea-level rise accelerated. Reef islands in Funafuti continually adjust their size, shape, and position in response to variations in boundary conditions, including storms, sediment supply, as well as sea level. Results suggest a more optimistic prognosis for the habitability of atoll nations and demonstrate the importance of resolving recent rates and styles of island change to inform adaptation strategies.

[Image: Tuvalu]

14. Food shortage predictions

1994 A study by Columbia and Oxford University researchers predicted that under CO2 conditions assumed to occur by 2060, food production was expected to decline in developing countries (by up to 50% in Pakistan). Even a high level of farm-level adaptation in the agricultural sector could not prevent the negative effects. See here.

2008 Stanford researchers predicted a 95% chance that several staple food crops in South Asia and Southern Africa would suffer crop failures and produce food shortages by 2030, due to 1°C warming from the 1980-2000 average. See here.

Reality check: On average, food production in developing countries has been keeping pace with their population growth. Pakistan, with 180 million people, is among the world’s top ten producers of wheat, cotton, sugarcane, mango, dates and kinnow oranges, and holds 13th position in rice production.
Pakistan shows impressive and continuously growing agricultural production, according to FAO. See here.

15. Climate refugee predictions

2005 Janos Bogardi, director of the Institute for Environment and Human Security at the United Nations University in Bonn, and the United Nations Environment Program (UNEP) warned that there could be up to 50 million environmental refugees by the end of the decade. See here.

2008 UN Deputy Secretary-General Srgjan Kerim tells the UN General Assembly that it has been estimated there would be between 50 million and 200 million environmental migrants by 2010. See here.

2008 UNEP map showing the areas of origin of the 50 million climate refugees expected by 2010. See here.

Figure 3. Fifty million climate refugees by 2010. Climate refugees will mainly come from developing countries, where the effect of climate change comes on top of poverty and war. UNEP/GRID-Arendal map, source here.

2011 Cristina Tirado, from the Institute of the Environment and Sustainability at UCLA, says 50 million “environmental refugees” will flood into the global north by 2020, fleeing food shortages sparked by climate change. See here.

Reality check: As of 2017 only one person has claimed climate change refugee status: the world’s “first climate change refugee,” Ioane Teitiota from Kiribati. His claim was dismissed by a court in New Zealand in 2014. See here.

16. Climate change casualty predictions

1987 Dr. John Holdren, then a professor at U.C. Berkeley and later director of the Office of Science and Technology Policy for the Obama administration, was cited by Paul Ehrlich: “As University of California physicist John Holdren has said, it is possible that carbon dioxide climate-induced famines could kill as many as a billion people before the year 2020.” See here.

2009 Dr. John Holdren, director of the Office of Science and Technology Policy for the Obama administration, when questioned by Sen. David Vitter, admitted that the loss of 1 billion people by 2020 was still a possibility.
See here.

Reality check: There was a 42% reduction in the number of hungry and undernourished people from 1990-1992 to 2012-2014. Currently, the world produces enough food to feed everyone. Per capita food availability for the whole world has increased from 2,220 kcal/person/day in the early 1960s to 2,790 kcal/person/day in 2006-2008. See here.

17. Time running out predictions

1989 Noel Brown, director of the New York office of the U.N. Environment Program (UNEP), says that within the next 10 years, given the present loads that the atmosphere must bear, we have an opportunity to start the stabilizing process. See here.

2006 NASA scientist James Hansen says the world has a 10-year window of opportunity to take decisive action on global warming and avert catastrophe. See here.

2007 U.N. scientists say only eight years are left to avoid the worst effects. See here.

B. Failure to predict

1. A greener planet

1992 The CO2 fertilization effect was well known, and experiments since at least 1988 showed that farm yields increased significantly. This was an easy prediction to make, yet it was ignored. See here.

In 2007 the IPCC was still downplaying the importance of the effect: “Since saturation of CO2 stimulation due to nutrient or other limitations is common, it is not yet clear how strong the CO2 fertilization effect actually is.” See here.

However, recent satellite image analysis of changes in the leaf area index since 1982 has demonstrated a very strong greening over 25-50% of the Earth. CO2 fertilization is responsible for most of the greening, with the increase in temperatures also contributing. See here.

2. Increase in forest biomass

2006 For four of the past five decades, global forest dynamics were thought to be driven primarily by deforestation. Only in the last decade was it noticed that a great majority of reports contradicted that assumption.
“Of the 49 papers reporting forest production levels we reviewed, 37 showed a positive growth trend.” The authors also write that “climatic changes seemed to have a generally positive impact on forest productivity” when sufficient water is available. See here.

2010 The observed forest biomass increase was found to greatly exceed natural recovery, and was attributed to climate change, through changes in temperature and CO2. See here.

2015 Satellite passive microwave observations demonstrate that the trend is global and is accompanied by a recent decrease in tropical deforestation. See here.

3. Carbon sink increases

1992 In the late 1980s a “missing sink” was discovered in the carbon budget accounting, and was discussed through the 1990s. The possibility that Earth’s oceans and terrestrial ecosystems could respond to the increase in CO2 by absorbing more CO2 had not occurred to climate scientists, and when it did occur to them they mistakenly thought that deforestation would be a bigger factor. See here.

4. Slowdown in warming

2006 Professor Robert Carter, a geologist and paleoclimatologist at James Cook University, Queensland, was one of the first to report the unexpected slowdown in warming that took place between 1998 and 2014. See here.

LORD MONCKTON ALLEGES FRAUD IN DATA IN IPCC 4TH REPORT
Monckton challenges the IPCC – suggests fraud – and gets a response
Anthony Watts / May 20, 2013

The IPCC fraud case (but not the planet) hots up
Guest essay by Christopher Monckton of Brenchley

Global warming is not accelerating. The planet is not hotting up. There has been no warming for 17 years on any measure, as the IPCC’s climate-science chairman now admits. That includes the Hadley/CRU data.
There has been no warming for 23 years according to the RSS satellite dataset.

The IPCC’s central projection of warming since 2005 (bright red), taken from the forthcoming Fifth Assessment Report, is visibly at odds with the linear-regression trend (bright blue) on the latest version (HadCRUt4) of the monthly global mean surface temperature anomaly curve (dark blue).

I received no reply to my report of the IPCC’s erroneous conclusion that global warming was “accelerating”. So today I wrote to the IPCC again:

“I am an expert reviewer for the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. I wrote to you two weeks ago to report a serious error in the Fourth Assessment Report. I have had no reply. My letter of two weeks ago is attached, together with a copy of a letter I have sent to the Inter-Academy Council asking it to use its good offices to persuade you to reply. I have also sent a letter, for information only at this stage, to the police in Geneva, since it appears that a fraud may have been committed by the IPCC.”

In my letter to the police in Geneva, which I also copied to the Serious Fraud Office in London and the Office of the Attorney General of the Commonwealth of Virginia, I wrote:

“The attached correspondence evidences a fraud at the IPCC. Its secretariat has not responded to my report of an error in its Fourth Assessment Report (2007). The error is serious. I can prove it is deliberate. It is designed to demonstrate by deception that the world is warming ever faster and that we are to blame. It is one of a series of ingenious, connected frauds that have profited a few at great expense to many.

“The frauds are wilful deceptions calculated to cause loss to taxpayers by tampering with scientific data and results so as to exaggerate the rate and supposed adverse consequences of global warming. Scientific debate is legitimate: subjective distortion of objective science for profit is not.

“This letter is for information.
If after a further week the IPCC (to which I am copying this letter) fails to acknowledge my report of its error as its own procedures require, I shall invite you to investigate this and other connected frauds, which involve larger sums than any previous fraud.”

The IPCC has not delayed in replying this time:

“We acknowledge receipt of your message copied below and of your letter dated 4 May 2013, received earlier today as an attachment to that message. Your email with attachments of today is the first communication received at the IPCC Secretariat from you on this matter.

“We would like to inform you that the error claim that you have submitted is now being taken care of as per the IPCC Protocol for Addressing Errors in IPCC Assessment Reports, Synthesis Reports, Special Reports or Methodology Reports, available on the IPCC website. Steps 1 and 2 of the protocol are now completed; the IPCC Working Group I will deal with next steps as appropriate. As per the protocol, the IPCC Secretariat will inform you of the conclusions of the process.”

I have thanked the IPCC for passing on my report of its error in the Fourth Assessment Report and have told the police the IPCC have now replied. It is clear from the IPCC Secretariat’s reply that Dr. Pachauri, to whom I had reported the error in writing and in person as long ago as 2009, had not passed my report of the error to the Secretariat as he should have done. No doubt there will now be an internal enquiry to discover why he did not pass it on.

When the error has been investigated and the IPCC has reported back to me, I shall let you – and the prosecuting authorities of three nations – know the outcome.

The scientific climate community essentially ignored the issue until 2013 and has recently become split on its reality, with a small group denying it even took place.
Nobody in the scientific community is even considering the possibility that the “Pause” might not have ended and was only temporarily interrupted by the big 2015-16 El Niño.

Conclusions

There is only one possible conclusion regarding the reliability of climate predictions: outspoken, catastrophe-minded climate scientists and high-ranking officials don’t have a clue about future climate and its consequences, and are inventing catastrophic predictions in their own interest. Government policies should not be based on their predictions of the future.

Another conclusion is that studies and opinions about future climate are heavily biased toward negative outcomes that fail to materialize, while ignoring positive outcomes that are materializing.

This post was edited a little by Andy May, who believes the only safe prediction is that the predictions of “consensus scientists” will continue to be wrong.

https://wattsupwiththat.com/2017/10/30/some-failed-climate-predictions/
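The “more than 94% wrong” figure quoted in the sea level section can be reproduced with a few lines of arithmetic. This is a minimal sketch using only the article’s own numbers (roughly 4 feet predicted per quarter century, the low end of 15-20 feet per century, versus 72 mm observed over 24 years); it checks the quoted percentage, nothing more.

```python
# Check of the sea-level figures quoted in section 12.
# Inputs are the article's own numbers, not independent measurements:
#   - predicted rise: ~4 feet per quarter century (low end of 15-20 ft/century)
#   - observed rise since 1993: 72 mm over 24 years

MM_PER_FOOT = 304.8

predicted_mm = 4 * MM_PER_FOOT   # ~1219 mm per quarter century
observed_mm = 72.0               # satellite-era total since 1993

shortfall = 1 - observed_mm / predicted_mm
print(f"predicted: {predicted_mm:.0f} mm, observed: {observed_mm:.0f} mm")
print(f"the prediction overshoots observation by {shortfall:.1%}")  # ~94.1%
```

With these inputs the shortfall comes out at about 94.1%, matching the “more than 94% wrong” claim; using the 15 ft/century figure directly over 24 years would give a somewhat smaller shortfall of roughly 93%.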
Why do some people hate cops?
This is a big question. To understand why people hate the police, we first have to understand why police officers behave the way they do, and why the judicial system supports them nine times out of ten. As I read all the answers given here, I am reminded of how much work we still have to do as a society to bring positive change to law enforcement agencies. Despite what we have learned about police brutality and misconduct, many people continue to blame a “few bad apples” for ruining a department’s reputation, and they quickly jump to conclusions, denouncing these individuals as representing only an extremely small minority.

For the LAPD, the blame goes even further with the revamp of police recruiting and selection practices that focus on deterring unfit individuals from becoming members of the organization, which reinforces the Department’s framing of police misconduct as the work of a few “bad apple” cops who slipped through the screening process and who do not reflect the culture of the organization. I reject this argument, and I will tell you why.

From a systems-theory standpoint, law enforcement agencies are an organizational system that exists within the context of a larger social system, which influences the way the organization operates and becomes dysfunctional. I argue that the environment (social culture) within which law enforcement operates is the deciding factor in determining organizational culture and behavior. Social culture influences organizational culture, as police officers bring their learned assumptions to judgments and decisions in the work environment, and it is this interaction between the perilous social culture of urban cities and the adaptive culture of the organization that creates the perfect formula for dysfunction. Police corruption is, by and large, supported by social structures and attitudes that are embedded in local society.
Conventional wisdom tells us bad apples don’t fall far from the tree.

Today, it is not uncommon to turn on the news and hear that somewhere in the United States somebody was hurt or killed by the police. Living and working in Los Angeles, I hear about this almost daily. Because I work in the mental health field, sometimes these tragic stories hit closer to home for me than I would like. A few years ago, a 37-year-old homeless man with mental illness named Kelly Thomas was fatally beaten by six local police officers at a bus depot in Fullerton, California. According to the Huffington Post, the beating was captured by a bystander with a cell phone, and bus surveillance tape released later showed how officers beat Thomas and used a stun gun on him repeatedly as he cried out for his father. In an interview, Thomas’ father reported, “When I arrived at the hospital to see him, I honestly thought that gang bangers had got a hold of him like the cowards sometimes do and just beat him with a baseball bat in the face. Immediately my thoughts were to get with Fullerton police ... and I didn’t learn until a certain amount of hours later the truth. That put me in absolute shock” (Huffington Post, 2011). The beating was so unjustifiably brutal that the story quickly became international news. The Department quickly issued a statement saying that the case was an isolated incident, and that the officers had received training on how to deal with the mentally ill and the homeless. This is a classically individual-level analysis, leading to the notion that the problem lies within the character of a few “bad apples”.

Another recent case of police brutality happened at Pacific Clinics in Rosemead, California. Though this particular case did not receive the same media attention as the Thomas case, it felt closer to home because of the relationship between Pacific Clinics and the agency I worked for, APCTC.
One of our staff psychiatrists also worked for Pacific Clinics, and the person who was shot and killed by deputies from the sheriff’s Temple Station was his patient. The Pasadena Star-News reported that three deputies from the sheriff’s Temple Station were involved in the fatal shooting of a mentally ill woman sitting inside the clinic with a hammer in her lap. The victim was identified as Jazmyne Ha Eng, a 40-year-old, 4-foot-11, 95-pound Cambodian woman with a history of psychological disorders, who was said to be wielding a “full-sized” ball-peen hammer when she was shot and killed by deputies. The initial incident report stated that a deputy tried to shock Eng with his taser, but it was not effective. Eng then advanced toward the deputies with the hammer. Fearing for his safety, a deputy fired two rounds from his duty weapon. Eng was pronounced dead at the scene. The coroner’s report reveals no trace of a taser dart on Eng’s body, only the gunshot wounds that killed her (SGV Tribune, 2012).

Cases of police abuse and corruption occur more frequently than publicly reported (Bayley & Perito, 2011; Weisburd, Greenspan, Hamilton, Williams, & Bryant, 2000). Of those reported, a few serious cases ignite intense public debate. For instance, in 2000 the Los Angeles Police Department issued a report by a board of inquiry into the “Rampart Area Corruption Incident,” prompted by allegations of bank robbery, false arrest, falsifying reports, theft of cocaine from the police property room, and the beating of handcuffed suspects. The Rampart scandal, with more than 70 officers implicated, is often referred to as the most widespread case of police corruption in U.S. and LAPD history. The Los Angeles Police Department developed a version of the story implicating a very small group of Black and Hispanic officers as responsible for all the misconduct (LAPD, 2000).
Two other high-profile cases in recent memory are the Rodney King beating by LAPD police and the torture of Abner Louima by New York City police.

Poverty, Ethnicity, and Crime: A Sociocultural Perspective

Most cases involving police misconduct occur in large urban cities, such as Los Angeles, New York, Chicago, Detroit, and Baltimore. Studies on the intersection of urban poverty, crime, and the racial divide show a correlation between variations in crime rates and socioeconomic status and race, with crime rates (mostly gang- and drug-related crimes) higher in urban poor communities. These low-socioeconomic neighborhoods experience great levels of poverty, racial heterogeneity, transience, family disruption (Bobo, 2009; Devah, 2007; Venkatesh, 2000), high unemployment, unequal access to quality education, unequal access to police services and legal aid services, and disparities in political representation (Bartels, 2005), thus creating a society that tolerates and embraces unlawful behaviors of ingroup members, and thereby providing ample opportunities for police abuse of power and corruption.

Sociocultural theory posits that our cognitive development and learning processes are influenced by societal culture, leading to the notion that our beliefs, moral values, attitudes, manners, normative behaviors, and work ethics embody the societal culture to which we are accustomed (Vygotsky, 1986). This perspective assumes that our social mores teach us right from wrong, and that our attitudes and behaviors are culturally dependent. In other words, people are simply products of their societal culture, implying possible consequences for those who grew up in a “bad” neighborhood because of the likelihood that they will become “bad” just like their social environment.
Possibly, this is the reasoning behind the LAPD policy that mandates automatic disqualification of convicted felons from becoming police officers, even though studies show supervisor background ratings are not useful in predicting integrity problems (Fischler, 2009). Moreover, disqualifying ex-convicts does not stop, nor explain, why some police officers with clean records routinely violate the laws they have publicly sworn to uphold. It is not uncommon to find clean rookie cops turning dirty after a relatively short time on the job, suggesting that law enforcement, as an organization, has been influenced by the external social culture, adapting its internal processes in order to survive in the external environment in which it operates. Consequently, this need to conform to, and eventually dominate, the parent culture leads the organization to dysfunction. From this perspective, punishing a few “bad apples,” in addition to preventing felons from slipping through the cracks during the selection process, only indicates that the organization has missed the mark and the real culprit has not yet been identified, and therefore the problem is left unchanged.

While disqualifying certain criminals from becoming police officers may be the right card to play in the game of social politics, it does little to prevent future cases of police misconduct from recurring. Undoubtedly, law enforcement, as an organization, is aware of the personality types of the men and women it selects for police work; extensive data on the personality traits of those selected, the nature of the job, and the operational structure of the organization are readily available to assist in determining which candidate is best suited for the job.
It is difficult to imagine that a powerful organization such as U.S. law enforcement is ignorant of the considerable overlap in day-to-day activities between armed police officers and criminals. A meta-analysis comparing personality traits between two adversarial groups, the police and criminals, shows a stark similarity between the two, including stress, long hours, tension, life-threatening situations, the use of coercion, the expectation of conflict, a code of silence, and the opportunity to work in large, powerful gangs (Wisenheimer, 2009). The personality traits and work attitudes of 108 criminals convicted of assault and 96 armed police officers were also compared using the MMPI and the Zuckerman-Kuhlman Personality Questionnaire. Both groups scored significantly higher than the general population and various occupational groups on the Impulsive Sensation Seeking, Aggression-Hostility, and Work Activity scales, and significantly lower on Neuroticism-Anxiety and Sociability. No significant differences were observed on the Lie scale.

The study goes on to list many more variables shared by both groups, concluding that armed police officers and violent criminals are “two sides of the same coin, united in an unbreakable bond”. Wisenheimer (2009) even recommended commissioning them for active police duty within the anti-gang crime unit of the police department. More on psychological screening is discussed later in the article.
For now, why some law enforcement agencies, such as the Los Angeles Police Department, automatically disqualify ex-convicts of violent crimes yet select candidates with equal potential for breaking the law is unclear, but I speculate that the answer lies in the dynamics between the American public and the politics of law enforcement.

Poverty, racial differences, social class, dense populations of criminal offenders, and high crime rates are powerful aspects of urban cultures that influence police behavior (Weisburd, 2009). Anecdotal evidence suggests a correlation between poverty-stricken minority communities and high crime rates. Conventional wisdom suggests that residents of poor urban neighborhoods face a higher risk of criminal victimization than other city residents. In addition, racial and ethnic minorities, African-Americans in particular, are considerably more likely to be victimized and incarcerated (Raphael & Sills, 2006). On top of high unemployment rates, these high incarceration and victimization rates (victimization often involving excessive force with the intention to cause serious bodily harm or death) are often attributed to a higher propensity among ethnic minorities, especially African American males, to criminally offend, creating a dysfunctional society that perpetuates police brutality and corruption.

We know that crime, whether committed by citizens or law enforcement, can and does happen in all communities. The news media frequently report crimes committed in large urban cities, with poor minority neighborhoods suffering disproportionately from police abuse of power (Bayley & Perito, 2011). Critics of police brutality argue that the racial/ethnic difference between citizens and the police is one of the main reasons leading to police abuse of authority (Weisburd et al., 2000). Studies show that police harassment of minorities is not an isolated occurrence (Grant, 2003; Weisburd et al., 2000; Kaplan, 2009; Greenspan, Weisburd, & Bryant, 1997).
For example, two studies conducted in two Midwestern states, Illinois and Ohio, found that more than 25% of minority police officers had observed police using considerably more force than necessary when apprehending a suspect of an ethnic minority, and harassing a citizen because of his or her race (Martin, 1994; Knowles, 1996). Recently, the use of racial profiling by law enforcement in the State of Arizona had the Latino community protesting in the streets across the country, denouncing the newly enacted law, known as SB 1070, as unconstitutional. A subsection of this piece of legislation allows law enforcement agents to stop a person at any given place and time (the Latino community is implicitly targeted) whom they believe looks like an illegal alien, and demand proof of citizenship. Many Americans, especially Mexican Americans, believe SB 1070 is racially motivated, and they claim that such racial profiling constitutes harassment and a violation of basic human rights.

Bad Apples Don’t Fall Far From the Tree

Laws like SB 1070 rarely happen in a vacuum. According to the annual report issued in 1990 by the Southwest Border HIDTA Arizona Partnership, a subdivision of the Office of National Drug Control Policy, the two major metropolitan areas in the region, Tucson and Phoenix, are primary distribution centers and drug transit areas because of their close proximity and easy access to the Arizona border with Sonora, Mexico. Multi-ton quantities of cocaine, marijuana, methamphetamine, and heroin are smuggled into the cities annually to be distributed across the states. The report indicates that drug-related violence, homicides, and property crimes are increasingly perpetrated by drug gangs and abusers. Drug proceeds are smuggled from Arizona to Mexico in bulk form by vehicles, commercial shipments, pedestrians, and so on.
The report also indicates that increased interdiction by law enforcement has frustrated narcotics smugglers, leading to increased acts of violence toward law enforcement agents. Over the course of extended interaction between two cultures, border patrol and drug gangs, many cops succumb to the dominant culture (drug gangs) and join in its actions, while others refuse to conform and be tainted. Not every resident of a “ghetto neighborhood” is a gang member, just as not every cop is a hero. Nevertheless, the culture that develops in a poor urban society seems to embrace violence and corruption as symbols of identity, in which economic marginalization breeds anger and resentment and motivates crime and violence. Today, the national war on drugs rages on, with law enforcers and the outlawed continuing to play cat-and-mouse. The troubled external societal culture shapes the culture of law enforcement agencies, as agents bring their learned assumptions to judgments and decisions in the work environment and end up either colluding with law offenders in corruption or putting ordinary citizens in harm's way for personal gain. Consequently, police brutality and corruption occur routinely. Again, the same argument goes: three cops can go crazy, but more than 70 cops, for instance, embody a culture of policing, one whose development conforms to the identity chosen by the social culture in which they grew up. The following are cases in point. Border patrol corruption prompted a reform of immigration law in the State of Arizona leading up to the enactment of SB 1070 in 2011. According to ABC News investigators, a total of 134 patrol agents in Naco, Arizona were arrested or indicted for corruption in the past seven years (ABC News, 2011). One agent used his own patrol car to smuggle drugs. Another case concerned an agent allowing illegal aliens through a point of entry without checking their documentation.
Another case involved an agent pleading guilty to selling national security documents. The border patrol relies on sensors embedded in the ground to track smugglers. Less than a year ago, an agent in Tucson pled guilty to giving a drug trafficker the locations of more than 100 of the sensors. Prosecutors say he did it for a $3,000 bribe. Clearly, the effect of social culture on organizational culture is profound. Remedies for police corruption would seem to depend upon local social dynamics and traditions, as well as the capacity of local jurisdictions to manage them. Police corruption is typically supported by social structures and attitudes that are embedded in local ways of life. Thus, changing organizational cultures requires the transformation of local cultures.

Blaming a Few “Bad Apples” Won’t Work

Law enforcement organizations have grown much more diverse in recent years (Sklansky, 2006). The police departments of today's large American cities, such as the Los Angeles Police Department and the New York Police Department, are no longer the homogeneous workplaces they once were; they now employ large numbers of female and minority officers. Openly gay and lesbian officers, too, are well represented in Los Angeles. In the context of the social cultural system, law enforcement as an organizational cultural system routinely experiences behavioral misdeeds by rogue officers. It is well known that police, as an organization, lie, steal, cheat, commit acts of extortion, make false arrests, plant evidence, and even commit murder for personal gain. Police corruption scandals are common (Grant, 2003), but the organization's first defense is always to say that each is an isolated case of a few “bad apples”, that it does not reflect the core values of the organization, and that the officers who committed these horrendous acts should never have been hired.
Institutional reforms were made by the Los Angeles Police Department (LAPD) and the Los Angeles County Sheriff's Department to weed out the “bad apples” following the Rodney King beating by LAPD police and the Rampart police corruption scandal. In the report “Rampart Area Corruption Incident,” Chief of Police Bernard Parks recommended pre-employment testing and screening of police officer candidates, in addition to a thorough background check and a few other assessment measures, as part of an ongoing effort to weed out the “bad apples”. This perspective assumes that better hiring practices will help the organization hire people who will be less likely to abuse their authority as law enforcers. Cases like those of Eng, Thomas, and many others that have happened since the Rampart Scandal and the Rodney King beating suggest that blaming a few “bad apples” has not been working out too well for the American public, and that business has resumed as “normal” at the Department.

Pre-employment Psychological Testing: Does It Really Help?

Though the use of psychological testing for police recruits was first suggested in 1931 by the Wickersham Commission (Dantzker, 2011), police agencies chose not to use it much until a decade ago. The Rampart Scandal in the late 1990s forced the LAPD to undergo a complete reform of its personnel selection practices (LAPD, 2000). In response to the scandal, the Board of Inquiry recommended the use of psychological tests on all new recruits (LAPD, 2000). In compliance with the Board of Inquiry's recommendation, the Department opened its first psychological service program in 2000 to assist the Personnel Department with the selection of new recruits.
The goal is to weed out unqualified candidates and prevent future police misconduct. The most commonly used personality tests and inventories in departments throughout the United States are the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), the California Psychological Inventory (CPI), the Sixteen Personality Factor Questionnaire (16PF), the Edwards Personal Preference Schedule, and the Inwald Personality Inventory (IPI) (Cochrane, Tett & Vandecreek, 2003). The LAPD uses the MMPI-2 and a clinical interview as psychological screening tools in its selection process, taking a pass-fail approach to the results: candidates who pass are kept and those who fail are rejected. Interestingly, studies have indicated that although psychological assessment appears to be valued in the selection process, very few candidates are rejected based solely on the results, and departments that use a pass-fail approach use psychological assessments mainly to rule out psychopathology (Cochrane, Tett & Vandecreek, 2003). Data on the type of psychological screening used by the LAPD prior to the Rampart Scandal were not available, but current selection practices appear to be consistent with research findings suggesting that qualified candidates are pre-selected through other procedures prior to the psychological evaluation. Therefore, once the candidates complete the evaluation, few of them are found to be outright unqualified (Cochrane, Tett & Vandecreek, 2003). In other words, unless a candidate is mentally ill, the LAPD would hire candidates who present themselves as extremely masculine and as having stereotypically masculine interests.
A meta-analysis of MMPI measurements of common personality traits of police officers indicates that both male and female police officers gave defensive profiles, with male officers presenting themselves as having stereotypically male interests and attitudes, while female officers rejected traditional feminine roles and stereotyped behaviors (Kornfeld, 1995). Results of another meta-analysis of police personality profiles show that many individuals who seek admission into police training programs tend to possess personality traits suited to policing; that is, they are self-disciplined (controlled), socially bold (independent), extraverted, emotionally tough, and low in experienced anxiety (Eber, 1991). However, one in four police officers possesses personality traits associated with relatively high levels of schizophrenia, paranoia, psychasthenia, and other symptoms ill-suited to being a cop (Lorr & Strack, 1994). These individuals appeared just as tough and independent as the “good” cops, but they were lower in self-control and extraversion and much higher in anxiety (Lorr & Strack, 1994). Furthermore, studies that compared personality traits between cops and criminals have shown that armed police officers and violent criminals are “two sides of the same coin, united in an unbreakable bond” (Wisenheimer, 2009), leading to a recommendation that they be commissioned for active police duties within anti-gang crime units of the police department.

Conclusion

The literature on police abuse of authority is vast, and many theories have been developed since the Rampart Scandal to give meaning to, and to solve, this deeply rooted social problem. There are plenty of reasons why people hate cops.
When we as a society see cops as “them versus us”, we do not allow ourselves to see the big picture: cops reflect the kind of society we created, and therefore we have the responsibility to influence it and make positive change. Social culture influences organizational culture, as police officers bring their learned assumptions to judgments and decisions in the workplace, and it is this interaction between the corrupt social culture of urban cities and the adaptive culture of organizations that creates the perfect formula for organizational dysfunction. Police corruption is, by and large, supported by social structures and attitudes that are embedded in local cultures. When law enforcement agencies operate under the assumption that a few “bad apples” are to blame for misconduct, they miss the opportunity to look at the big picture, to identify and analyze problems from the outside looking in, to understand where the problems really stem from, and then to have the courage to lead, engage, unite, and transform local communities toward social change through positive integration and innovation. There is no room for ego or an “us versus them” attitude. Remember that organizational dysfunction is strongly influenced by social structures and attitudes embedded in local societies. Thus, changing organizational cultures requires the transformation of local cultures. The Justice Department and law enforcement are doing the best they can under the circumstances. They are the product of our society, and until we change as a society, things will remain the same. If we focus on this, then we would not have free time to sit around hating cops.

References

Bayley, D., & Perito, R. (2011). Police corruption: What past scandals teach us about current challenges. United States Institute of Peace. Washington, DC: Special Report.

Cochrane, R. E., Tett, R. P., & Vandecreek, L. (2003). Psychological testing and the selection of police officers.
Criminal Justice and Behavior, 30(5), 511–537.

Dantzker, M. L. (2011). Psychological preemployment screening for police candidates: Seeking consistency if not standardization. Professional Psychology: Research & Practice, 42(3), 276–283.

Grant, J. (2003). Assault under color of authority: Police corruption as norm in the LAPD Rampart scandal and in popular film. New Political Science, 25(3), 385.

Kaplan, P. J. (2009). Looking through the gaps: A critical approach to the LAPD's Rampart scandal. Social Justice, 36(1), 61–81.

Knowles, J. J. (1996). The Ohio Police Behavior Study. Columbus, OH: Office of Criminal Justice Services.

Kornfeld, A. D. (1995). Police officer candidate MMPI-2 performance: Gender, ethnic, and normative factors. Journal of Clinical Psychology, 51(4), 536–540.

Lorr, M., & Strack, S. (1994). Personality profiles of police candidates. Journal of Clinical Psychology, 50(2), 200–207.

Martin, C. (1994). Illinois Municipal Officers' Perceptions of Police Ethics. Chicago, IL: Illinois Criminal Justice Information Authority.

Pager, D. (2007). Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration. Chicago: University of Chicago Press.

Perry, A. E. (2010). The evolution of police organizations and leadership in the United States: Potential political and social implications. Law, Policy, and Society Dissertations. Paper 20. http://hdl.handle.net/2047/d20000809.

Raphael, S., & Sills, M. (2007). Urban crime, race, and the criminal justice system in the United States. A Companion to Urban Economics, 515–535.

Regehr, C., LeBlanc, V., Jelley, R., & Barath, I. (2008). Acute stress and performance in police recruits. Stress & Health: Journal of the International Society for the Investigation of Stress, 24(4), 295–303.

Simmers, K. D., Bowers, T. G., & Ruiz, J. M. (2003). Pre-employment psychological testing of police officers: The MMPI and the IPI as predictors of performance.
International Journal of Police Science & Management, 5(4), 277–294.

Venkatesh, S. A. (2000). American Project: The Rise and Fall of a Modern Ghetto. Cambridge, MA: Harvard University Press.

Vygotsky, L. (1986). Thought and Language. Cambridge, MA: The MIT Press.

Weisburd, D., et al. (2000). Police attitudes toward abuse of authority: Findings from a national study. US Department of Justice, Office of Justice Programs, National Institute of Justice.

Wisenheimer, R. (2009). Separated by birth: The personality of armed police and criminals. Interim findings from a research study. Richard Wisenheimer Crime Research & Advisory Centre.