Harvard Assumption Of Risk And Release: Fill & Download for Free


How to Edit Your Harvard Assumption Of Risk And Release Online Easily and Quickly

Follow these steps to get your Harvard Assumption Of Risk And Release edited with ease:

  • Hit the Get Form button on this page.
  • You will go to our PDF editor.
  • Make changes to your document using the tools in the top toolbar, such as signing and erasing.
  • Hit the Download button to save your all-set document to your local computer.

We Are Proud of Letting You Edit Harvard Assumption Of Risk And Release With a Simplified Workflow

Get Started With Our Best PDF Editor for Harvard Assumption Of Risk And Release


How to Edit Your Harvard Assumption Of Risk And Release Online

If you need to sign a document, you may also need to add text, insert the date, and do other editing. CocoDoc makes editing your form faster than ever. Let's see how this works.

  • Hit the Get Form button on this page.
  • You will go to our online PDF editor page.
  • When the editor appears, click a tool icon in the top toolbar to edit your form, with options like highlighting and erasing.
  • To add a date, click the Date icon, then hold and drag the generated date to the target spot.
  • To change the default date, type another date into the box.
  • Click OK to save your edits and click the Download button when you finish editing.

How to Edit Text for Your Harvard Assumption Of Risk And Release with Adobe DC on Windows

Adobe DC on Windows is a useful tool for editing files on a PC. It is especially handy when you have a lot of file-editing work to do without a network connection. So, let's get started.

  • Click the Adobe DC app on Windows.
  • Find and click the Edit PDF tool.
  • Click the Select a File button and select a file from your computer.
  • Click a text box to change the text font, size, and other formatting.
  • Select File > Save or File > Save As to confirm the edit to your Harvard Assumption Of Risk And Release.

How to Edit Your Harvard Assumption Of Risk And Release With Adobe DC on Mac

  • Select a file on your computer and open it with Adobe DC for Mac.
  • Navigate to and click Edit PDF in the panel on the right.
  • Edit your form as needed by selecting the tool from the top toolbar.
  • Click the Fill & Sign tool and select the Sign icon in the top toolbar to customize your signature in different ways.
  • Select File > Save to save the changed file.

How to Edit your Harvard Assumption Of Risk And Release from G Suite with CocoDoc

Do you like using G Suite for your work and need to complete a form? You can edit your form in Google Drive with CocoDoc and fill out your PDF with a streamlined procedure.

  • Go to Google Workspace Marketplace, then search for and install the CocoDoc for Google Drive add-on.
  • In Google Drive, find the form, right-click it, and select Open With.
  • Select the CocoDoc PDF option, and allow your Google account to integrate with CocoDoc in the popup window.
  • Choose the PDF Editor option to open the CocoDoc PDF editor.
  • Click a tool in the top toolbar to edit your Harvard Assumption Of Risk And Release, such as signing and adding text.
  • Click the Download button to save your form.

PDF Editor FAQ

What is the difference in investment philosophy between Wealthfront, Betterment, and Hedgeable?

I am the Co-Founder of Hedgeable, so I can explain it from our perspective.

At a high level, the main philosophical difference is between the buy-and-hold approach (in investing terms, Modern Portfolio Theory, or MPT) used by Wealthfront and Betterment, and the Dynamic Asset Allocation (DAA) used by Hedgeable. MPT does not take limiting downside and portfolio drawdowns into account, while DAA does. DAA is built on top of the Modern Portfolio Theory research dating to 1952, so let's first examine what MPT is and what it isn't, then look at how DAA improves on its flaws.

MPT Background

MPT was first introduced by economist Harry Markowitz in 1952 (Markowitz, "Portfolio Selection," The Journal of Finance). The framework has seen various iterations over the years but has remained largely the same, and many of Markowitz's core ideas still drive portfolio management in 2015. Target date funds, retirement-oriented mutual funds that utilize MPT, are extremely popular in the 401(k) market, for example: they accounted for over $700 billion at the end of 2014 and are expected to encompass over $2 trillion by 2019.

MPT has two main components:

  • Capital Market Assumptions & Allocation Constraints
  • Mean-Variance Optimization

Capital Market Assumptions & Allocation Constraints

Money managers make assumptions for all relevant asset classes, which are used as inputs for the mathematical model that determines portfolio construction. These assumptions include the expected return of a given asset class, the expected risk (measured by standard deviation), and the expected correlation coefficient for each asset class pair.

These assumptions are typically similar to the long-term historical averages for a given asset class, but the aim is to capture current and forward-looking market conditions as accurately as possible. Most managers update their assumptions annually; investment consulting firms publish 10-year capital market assumptions each year for this purpose.

Unconstrained MPT portfolio construction could lead to allocations that are not optimal for an investor's risk profile or preferences. To avoid this issue, minimum and maximum constraints are placed on each asset class so that a given allocation falls within a reasonable range. This also helps ensure a sufficient level of diversification in the resulting portfolios.

Mean-Variance Optimization

Using the capital market assumptions and accounting for the allocation constraints, one can generate potential portfolios with varying weights for each asset class. These portfolios can then be charted on a scatterplot, with the x-axis measuring expected risk and the y-axis measuring expected return. Mean-variance optimization (MVO) involves gleaning information from such a scatterplot, where "mean" refers to the mean expected return and "variance" refers to the expected risk.

An optimal portfolio, theoretically, is one that has infinite return and no risk. Since that is not attainable, optimization in practice means maximizing the return generated at a given level of risk. Graphically, the most optimal portfolios occupy the upper-left edge of the scatterplot. Within the MPT framework, this edge is called the "efficient frontier." The efficient frontier includes the portfolio with the least risk, the portfolio with the greatest return, and all those in between that maximize return for each level of risk. The goal of an MPT-based money manager is to construct a portfolio that falls on the frontier.
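To make the mechanics concrete, here is a minimal sketch of mean-variance optimization in Python. All of the capital market assumptions (returns, volatilities, correlations) and the 60% per-asset constraint are illustrative numbers invented for this example, not any firm's actual figures, and the frontier is approximated by sampling random constrained portfolios rather than by a closed-form optimizer.

```python
import numpy as np

# Illustrative capital market assumptions (hypothetical numbers):
# expected returns, volatilities, and correlations for three asset classes.
assets  = ["US Stocks", "Bonds", "Commodities"]
exp_ret = np.array([0.07, 0.03, 0.05])          # expected annual returns
vol     = np.array([0.16, 0.05, 0.20])          # expected risk (std dev)
corr    = np.array([[1.0, 0.1, 0.4],
                    [0.1, 1.0, 0.0],
                    [0.4, 0.0, 1.0]])
cov = np.outer(vol, vol) * corr                  # covariance matrix

rng = np.random.default_rng(0)
best = {}  # risk bucket -> (return, weights): approximate efficient frontier

for _ in range(50_000):
    w = rng.dirichlet(np.ones(len(assets)))      # random long-only weights summing to 1
    if w.max() > 0.60:                           # allocation constraint: 60% cap per asset
        continue
    port_ret  = w @ exp_ret
    port_risk = np.sqrt(w @ cov @ w)
    bucket = round(port_risk, 3)                 # group portfolios by risk level
    # Keep the highest-return portfolio at each risk level (mean-variance optimal).
    if bucket not in best or port_ret > best[bucket][0]:
        best[bucket] = (port_ret, w)

for risk in sorted(best)[::10]:                  # print a thinned-out frontier
    ret, w = best[risk]
    print(f"risk {risk:.1%}  return {ret:.1%}  weights {np.round(w, 2)}")
```

A production optimizer would solve the quadratic program directly; random sampling is used here only because it keeps the sketch short and dependency-free while still tracing the frontier's shape.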
Passive Management

After a portfolio is constructed, common practice is to take a passive management approach: the manager intervenes only to rebalance the portfolio when allocations diverge from the original weights. Some managers rebalance on a periodic schedule, such as quarterly or annually; others rebalance based on thresholds for each asset class, such as when a holding strays 2% or 5% from its model weight.

Whatever the timetable for rebalancing, the implementation is the same: the portfolio manager sells the winners and buys the losers to bring the portfolio back in line with the strategic weights that were set. This management style is pervasive in the asset management industry and affects the vast majority of Americans invested in advised accounts. Because it is so common, most Americans (and even most investors) believe it is the best and only way to invest.

Flaws in MPT

MPT has some key shortcomings when applied to the reality of investing in today's world. These shortcomings stem from three high-level assumptions:

1. Diversification works: cross-asset correlations remain constant over a given period.

The financial crisis of 2007-2009 exposed the fallibility of this assumption in the new global paradigm: deep interdependence led to a widespread downward spiral, with cross-asset correlations spiking well above their historical levels during the crisis.

2. Drawdowns do not matter: portfolios recover quickly.

In reality, the return required to recover from a drawdown grows much faster than the drawdown itself. A portfolio that falls by a fraction d must gain d / (1 - d) to return to its prior peak, so as the drawdown becomes more severe, the required return grows exponentially based on the concurrent increase of the numerator and decrease of the denominator: a 10% loss needs an 11% gain, but a 50% loss needs a 100% gain.

Given this, drawdown mitigation presents itself as an optimal way to promote account growth. A portfolio that experiences lower drawdowns will grow larger than one that experiences higher drawdowns, all else being equal. Consider two hypothetical portfolios: one that mirrors the return of the S&P 500, and one with the same return series except that it misses a single negative data point. The latter portfolio will grow larger, and the difference will compound as time progresses.

3. Constant market growth: markets grow in perpetuity.

A $100,000 USD investment in Japanese stocks in 1972 was worth over $3,000,000 just 17 years later. The past quarter century is a different story entirely: a similar $100,000 invested in 1989 would be worth just $63,000 in 2015. The Japanese market, which represents the 3rd largest economy in the world, has suffered multiple large drawdowns, and the investment has yet to regain its peak value (or even come close, for that matter). Proponents of MPT would argue that over a long period of time the market is certain to recover, but that recovery is not at all apparent, even after more than 25 years. There is a possibility that this market stays in a perpetual drawdown for the foreseeable future, and a further possibility that it never recovers; the same is true for any financial market in the world, even one with a strong history such as the US.
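To make the drawdown arithmetic concrete, here is a small sketch of the d / (1 - d) recovery rule and of the compounding effect of skipping one large loss. The six-period return series is hypothetical and chosen only for illustration.

```python
# Required gain to recover from a drawdown d: the account falls to (1 - d) of
# its peak, so it must grow by d / (1 - d) to get back to even.
for d in (0.10, 0.25, 0.50, 0.75):
    print(f"{d:.0%} drawdown -> {d / (1 - d):.0%} gain required to recover")

# The Japan example above: $100,000 falling to $63,000 is a 37% drawdown.
print(f"Japan example: {0.37 / (1 - 0.37):.0%} gain needed just to break even")

# Two portfolios with identical returns, except the second avoids one bad period.
returns = [0.10, -0.30, 0.12, 0.08, -0.05, 0.15]     # hypothetical return series
full = trimmed = 1.0
for r in returns:
    full    *= 1 + r
    trimmed *= 1 + (0.0 if r == -0.30 else r)        # worst loss replaced with 0%
print(f"with the loss: {full:.3f}x, without it: {trimmed:.3f}x")
```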
Dynamic Asset Allocation

A dynamic philosophy has four primary goals:

  • Limit drawdowns
  • Reduce volatility and beta
  • Track benchmarks on the upside
  • Achieve superior risk-adjusted returns over the long term

Hedgeable attempts to do this by melding two investment approaches: that of MPT-based passive managers, as described earlier, and that of sophisticated alternative managers like Bridgewater Associates.

The primary driver of Hedgeable's investing philosophy is dynamic asset allocation, defined as a rules-based allocation strategy that responds to market environments. Dynamic asset allocation differs from a passive or indexing strategy, which is static and not responsive (periodic rebalancing notwithstanding): rebalancing within a dynamic framework is conditional rather than pre-determined, and has a much greater impact on portfolio composition over time. It also differs from an active strategy that takes forward-looking positions to try to exploit future market conditions; dynamic allocations as described here are reactive, not predictive.

Hedgeable Methodology

Hedgeable's methodology utilizes a time-varying version of the CPPI (constant proportion portfolio insurance) framework [source: Hamidi, Maillet & Prigent, 2008]. The time-varying portion for Hedgeable is the level set for the floor, which changes based on market conditions and governs how rebalancing occurs.

Based on quantitative risk measures evaluated daily, Hedgeable moves portfolios between risky and safe assets. The risky asset is weighted on a scale from 100% to 0%, meaning the portfolio can be fully invested or fully protected at any given time, and can move up or down from each level in between. This allows Hedgeable to ladder out of a position as it becomes more risky, and ladder back in as it becomes less risky, in order to achieve an optimal mix for clients.

Hedgeable extends this framework to entire portfolios rather than a single risky asset: it constructs a diversified portfolio using MPT principles and overlays constant proportion risk management. Rather than treating a portfolio as a whole, Hedgeable looks at each holding individually to apply risk management on a daily basis. This allows three things:

  • Set optimal risk thresholds for each asset class. Commodities are much more risky than bonds, so different risk thresholds should apply.
  • Limit transactions and transaction costs. By separating a portfolio into buckets, unnecessary transaction costs are avoided and the benefit of every trade is maximized.
  • Maintain portfolio integrity. Each asset class has a strategic weight that acts as an anchor, so a given portfolio never gets more risky than intended (only less risky, when necessary) and no leverage is ever used.

As an example, in a hypothetical portfolio of 50% stocks, 30% bonds, and 20% commodities, Hedgeable can stay 50% in stocks and 30% in bonds while selling out of commodities if commodities slide into a bear market, protecting that portion of the portfolio. Up to 20% of the portfolio could be invested in a safe asset (e.g., cash), depending on just how risky commodities get. Hedgeable can assess each asset class this way, independently of the others.
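For illustration, here is a minimal sketch of the classic constant-multiplier CPPI allocation rule that this framework builds on. The account value, floor, multiplier, and simulated daily returns are all hypothetical, and the floor is simply held fixed; Hedgeable's actual time-varying floor rule is not public and is not reproduced here.

```python
import numpy as np

def cppi_weights(value, floor, multiplier):
    """Classic CPPI: risky exposure is a multiple of the cushion (value above
    the floor), capped between fully protected (0%) and fully invested (100%)."""
    cushion = max(value - floor, 0.0)
    risky = min(multiplier * cushion, value)     # no leverage: cap at 100%
    return risky / value, (value - risky) / value

rng = np.random.default_rng(1)
value, floor, m = 100_000.0, 85_000.0, 4.0       # hypothetical account, floor, multiplier

for day in range(250):                           # one simulated year of daily moves
    w_risky, w_safe = cppi_weights(value, floor, m)
    risky_ret = rng.normal(0.0003, 0.01)         # hypothetical daily risky-asset return
    value *= 1 + w_risky * risky_ret             # safe asset assumed to return ~0 daily
    # A time-varying variant, as described above, would also adjust the floor
    # here in response to market conditions; this sketch keeps it fixed.

print(f"ending value {value:,.0f}, final risky weight {w_risky:.0%}")
```

Note how the rule ladders by construction: as the value falls toward the floor, the cushion shrinks and the risky weight steps down toward 0%; as the value recovers, the weight steps back up.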
There is no reason to move out of stocks, for example, just because commodities are risky, so this is the most efficient approach to cutting downside risk while retaining upside potential. During severe crisis periods, it is possible for the entire portfolio to be allocated to the safe asset.

This framework can be applied to any type of portfolio: conservative or aggressive, diversified or concentrated, ETF-based or stock-based. Hedgeable only uses liquid, high-volume securities for core portfolios, so there is minimal friction in shifting allocations as market conditions shift.

As conditions change, Hedgeable weights recent data more heavily than older data. This optimizes responsiveness and avoids the "cashing out" effect: looking at more frequent and more recent data helps the strategy re-enter the market rather than getting stuck in capital protection [source: Giese, 2014].

Sources:

  • "Dynamic Strategies for Asset Allocation," 1988, Andre Perold (Harvard Business School) and Nobel Prize winner William Sharpe.
  • "Enhanced Balanced Portfolios With CPPI Methodologies," 2008, Francesco Rossi (Pioneer Investments).
  • "CPPI – Constant Proportion Portfolio Insurance: Protection with Dynamic Allocation Control," 2006, Martin Ermini (BNP Paribas).
  • "A Time-varying Proportion Portfolio Insurance Strategy Based on a CAViaR Approach," 2008, Benjamin Hamidi, Bertrand Maillet, and Jean-Luc Prigent, prominent financial academics in France.
  • "Risk-Budget Indices," Journal of Indexes, August 2014, Guido Giese, head of Indexes at RobecoSAM Group.
  • "Optimal Investment Strategies for Controlling Drawdowns," Mathematical Finance, July 1993, Dr. Sanford Grossman and Dr. Zhongquan Zhou (The Wharton School, University of Pennsylvania).

Do you think President Obama will ever release his college transcripts/grades?

No, he probably won't, as there is no need to, especially after his presidency. There are a number of problems with this question and with the answers given in support of the "conspiratorial" assumptions inherent to it. I wrote such a long comment on one answer that I have edited it into an answer of my own.

This question implies that Obama was a poor student and that if we only saw his transcripts, we'd see something incriminating. There is no precedent for presidential nominees to release their college transcripts, so holding Obama (or Trump, or whoever) to a new standard seems pretty arbitrary.

First, there is no evidence Obama was unqualified to enter Harvard, nor that he was a weak student while there; indeed, he graduated magna cum laude, which is straight-up GPA, likely the top 5% of his class. Both of his parents achieved graduate degrees, so why would you think their son would not be a strong student? Obama senior himself earned a master's degree from Harvard, so the "legacy" rule also gave him a leg up on the competition in addition to his minority status; many under-qualified white students get a chance this way as well (recall how W. Bush got into Yale?).

Next, Obama graduated from Columbia in 1983 but worked for a few years. His work experience may have made him an appealing applicant to start in the fall of 1988, a time frame which, according to one answer, had less "racist" admissions policies than 1981-87. Even if Harvard gave him a slot he was not academically worthy of, he proved to be very worthy of the opportunity. After year one, he won the competition (based on GPA and an essay) to become editor of the Harvard Law Review, the first black student to do so. You could claim the position was "given" to him, but there is no evidence of that. Would Harvard Law just hand over such an important role to fulfill some arbitrary need for the appearance of equity? No editorship was mandated to go to a minority, and why risk the decline of an important law journal just to appear equitable?

If the editorship had been unfairly given, others would likely have filed a grievance (these are law students, after all). And it turns out he was highly regarded in that role. (Cruz won the first editorship of the Harvard Latino Law Review, and no one seems to question his intelligence or the legitimacy of his degree.) While serving as editor, Obama graduated with honors, magna cum laude. Although I am a professor and was a strong student, I never graduated from any of my programs with honors (top 10%); nor did Donald Trump.

Obama did not earn a "black Harvard degree" any more than Cruz earned a "Latino Harvard degree." If you want to claim his grades were changed in any of his transcripts, you need more than logic or suspicion. I have seen a number of students admitted into programs based on legacy, favors, or minority status; those not worthy (of any ethnicity) usually floundered, and if they completed their programs at all, they fell short of honors and of the high-level assignments awarded by peers or professors. Heck, sometimes I wonder if I was given the favor of entry into my graduate program because my undergraduate mentor had completed the same program years earlier; I struggled, but never wavered, and completed my degree on schedule, yet I was nowhere near the kind of "star" Obama was.

Whether Obama used his academic strength to be a good or effective president is a completely different question, and one better answered with some time to see the long-term impact of his policies.

In a total nuclear exchange where the entire world's arsenals are used, how long would the nuclear winter last and would we survive?

Flashback to the 1980s: Mutually Assured Destruction (MAD).

Note: For an updated and more complete answer, please see Allen E Hall's answer to "Who would win in a war between Russia and the US?"

A lot has changed. In 2016, a nuclear winter isn't possible even in an all-out nuclear war. This is because both the quantities and the yields of the world's nuclear arsenals have dropped precipitously from the all-time high in 1986. The arsenals today are only 20% the size they were in 1986, and the total megatonnage available is less than 10% of the peak.

Surprisingly quietly, the USA and Russia have dismantled over 50,000 nuclear weapons over the past 30 years. The nuclear material from these bombs and from other stockpiles of weapons-grade material was recycled and used in nuclear power generation over the past 20 years. A fact few may be aware of: this actually crashed the uranium market in the early 2000s, when the glut of available fuel brought the open-market trading value down from $20 a pound to near $2 a pound. So a lot has changed from the time when many of us can remember the very real threat of mutually assured destruction.

Multi-Megaton Weapons Now Obsolete

What has changed such that the world no longer builds megaton weapons? The need for multi-megaton weapons was the result of the low accuracy of warhead delivery on target: we needed a sledgehammer approach to take out hardened targets, and the way that was done was through very high-yield bombs, typically 5 Mt or greater. The average nuclear weapon size today in 2016 is about 443 kt at full yield, but a large portion of those bombs can be adjusted in the field to a very small fraction of their potential yield.

Today, the accuracy of on-target delivery has massively improved: we hit what we aim for. This means we need less hammer to do the same job. In the 1980s, the development of earth-penetrating rounds was another game changer. Not only were we on target, but now we could penetrate hundreds of feet of earth and concrete before detonating the warhead. This allowed a 100 kt weapon to do the damage of a >1 Mt surface detonation. It is now the primary method for attacking hardened targets and is the final driver toward smaller-yield bombs.

The net effect of the use of EPWs (Earth Penetrating Weapons) is a reduction in the number of casualties as compared with a surface burst. This is primarily due to a 96% reduction in the weapon yield needed when using an EPW: the greater coupling of the released energy to the ground shock for a buried detonation is the same as a surface burst with 25 times the explosive energy. For rural targets, the use of a nuclear earth-penetrator weapon is estimated to reduce casualties by a factor of 10 to 100 relative to a nuclear surface burst of equivalent probability of damage. [1]

The average warhead size in the USA arsenal is 330 kt. The Russian average is higher, but not enough to change this outcome. To cause a nuclear winter, the debris clouds and smoke have to be lofted above the troposphere into the high stratosphere. Any debris or smoke released into the troposphere (below 70,000 feet) quickly rains out in the weather within a few days to a week or so at most. Nuclear weapon yields do not affect the environment on a linear scale: a 1 megaton bomb, even though it carries 10x the energy of a 100 kt bomb, does not produce 10x the destruction. Thermal radiation decays as the inverse square of distance from the detonation point, while blast decays as the inverse cube.
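A short sketch of what that non-linearity means in numbers: if blast overpressure falls off roughly as the inverse cube of distance, then the radius at which a given overpressure is reached grows only as the cube root of yield. This is the standard cube-root scaling rule, shown here as a simple calculation rather than a detailed weapons-effects model.

```python
# Blast overpressure falls off roughly as the inverse cube of distance, so the
# radius at which a given overpressure is reached scales as yield**(1/3).
small, large = 100, 1000                     # yields in kilotons: 100 kt vs 1 Mt
radius_ratio = (large / small) ** (1 / 3)    # ~2.15x the destructive radius
area_ratio = radius_ratio ** 2               # ~4.6x the area destroyed
print(f"10x the yield -> {radius_ratio:.2f}x the radius, {area_ratio:.1f}x the area")
```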
Much of that extra heat and energy goes straight up, and its effect drops off quickly with distance from the point of detonation. With smaller yields, the energy isn't enough to breach the stratosphere, and for bombs of that size the earth has its own protection mechanism for particles released in the troposphere, called the weather, and it is extremely efficient.

The only way to keep particles aloft longer is to blast them considerably higher than 70,000 feet, and the reason this won't happen today is that the world has almost completely eliminated megaton-size bombs; shortly the elimination will be complete as the last ones are dismantled. Russia and the US have both eliminated megaton-size weapons from their high-alert strategic forces (ICBMs and SLBMs). To get anything above 70,000 feet, you need yields substantially above 1 megaton. The bombs deployed today will throw debris 50,000-60,000 feet into the atmosphere, and all of it will rain back down to earth in the hours and days that follow, near the point of detonation.

(The B83, a variable-yield gravity bomb of roughly 20 kt to 1.2 Mt, is slated for retirement in 2025 and is also being considered as a reserve against an asteroid impact; Marshall Space Flight Center has developed designs for an array of asteroid interceptors wielding 1.2-megaton B83 warheads. About 650 units remain in reserve, but not on alert status.)

We have all heard that there are enough nuclear weapons to kill everyone several times over, so let me put that myth to rest with a hypothetical scenario for maximum damage. Starting in an arbitrary corner of the USA (or, if you prefer, Canada), take the entire world's inventory of nuclear weapons (10,000 active and stockpiled) and place each one in its own circle covering 100 square miles. Using a world-average yield of 500 kt, this sets up the scenario for maximum destruction. All the warheads are then elevated to 6,000 feet, the height for maximum destruction and fatalities, and detonated. Each bomb would create a 10 km radius of destruction from its center, with 3rd- and 2nd-degree burns on the outskirts of that radius. The fallout would be minimal with only air bursts, and most dangers would be gone within hours or days after the blasts. Using every bomb in existence today, laid out as in this hypothetical scenario, the area of assured destruction would amount to only about 1/3 of the USA's total land mass. If it were Canada, many might not even notice. That's it. On a global scale, that is hardly a scratch: roughly 1/42 of the world's total land mass.
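A quick check of the scenario's arithmetic (the land-area figures below are approximate):

```python
# Arithmetic check on the maximum-damage scenario described above.
bombs = 10_000                  # world inventory, active and stockpiled (per the answer)
area_per_bomb_sq_mi = 100       # each bomb gets its own 100 sq mi circle
usa_sq_mi = 3_800_000           # approximate US land area

covered = bombs * area_per_bomb_sq_mi
print(f"area covered: {covered:,} sq mi")              # 1,000,000 sq mi
print(f"share of the USA: {covered / usa_sq_mi:.0%}")  # ~26%, rounded up to ~1/3 above
```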
Firestorms and Other Bad Science That Led to the Wrong Conclusions

A lot of new knowledge on pyrocumulonimbus cloud formation and soot transport into the lower stratosphere is still being interpreted. Until the early 2000s, it was thought that the boundary layer between the troposphere and the stratosphere presented a strong barrier to smoke; however, smoke columns rising into the lower stratosphere have since been observed. This indicates that there is some longer-lasting effect, but to what extent is still unanswered.

A 2010 study by the American Meteorological Society was the first modern attempt to quantify these effects. The researchers tracked 17 stratospheric smoke plumes in 2002 and found that the average time a plume remained detectable in the stratosphere was only about 2 months. The report indicates that particles of carbon soot start to clump together at some point after interacting with sunlight and then drop out of the stratosphere quickly. [2] This happens in weeks, not years, a major contradiction of the premise of nuclear winter theories. What isn't known is whether there is a tipping point of equilibrium that would keep the soot aloft if there were enough of it. Like many things, there is a certain element of the unknown here.

What is known is that the TTAPS study, made famous by Carl Sagan and his team, used exaggerated volumes of soot and smoke in its model, and its assumptions for a nuclear winter were significantly off. Key government studies since then have shown that the available combustible material used in the TTAPS models was significantly overstated, and this flaw carries through to every study since that has used TTAPS as the basis of its work. The nuclear winter theory relies heavily on the worst case of each of the many events that would unfold during a nuclear exchange, and as such exaggerates the effect dramatically. [3]

A contemporary example of prediction failing to model reality is the forecast effect of the Iraqis setting roughly 600 oil wells ablaze in 1991. Following Iraq's invasion of Kuwait, and Iraqi threats to ignite the country's 800 or so oil wells, speculation on the cumulative climatic effect, presented at the World Climate Conference in Geneva in November 1990, ranged from a nuclear winter type scenario to heavy acid rain and even short-term immediate global warming. As threatened, the wells were set ablaze by the retreating Iraqis in March 1991, and the 600 or so successfully ignited Kuwaiti oil wells were not fully extinguished until November 6, 1991, eight months after the end of the war. At their peak intensity, they consumed an estimated six million barrels of oil daily.

In articles printed in the Wilmington Morning Star and the Baltimore Sun in January 1991, prominent authors of nuclear winter papers (Richard P. Turco, John W. Birks, Carl Sagan, Alan Robock, and Paul Crutzen) collectively stated that they expected catastrophic, continental-scale nuclear winter-like effects with "sub-freezing" temperatures if the Iraqis went through with their threats of igniting 300 to 500 pressurized oil wells that could subsequently burn for several months. [4] Carl Sagan later conceded in his book The Demon-Haunted World that his predictions did not turn out to be correct: "it was pitch black at noon and temperatures dropped 4-6 °C over the Persian Gulf, but not much smoke reached stratospheric altitudes and Asia was spared."

The problem with the models that started the nuclear winter debate, the models used by Sagan and other teams of scientists at the time, is obvious when you look at the detail. The analysis was done at extremely low resolution and with no feedback loops. It was a 2D model, not a 3D model, so the volume and altitude of particles, heat flux, and fuel "mass loading" (the amount of fuel per square meter) were never actually calculated. The numbers were made uniform and plugged in as a single result for the entire world, so the heat flux, fuel mass loading, soot, smoke, and debris were the same whether the city was Fargo, North Dakota or Los Angeles. It was inherently wrong and fatally flawed. [5][6]
Peter Hobbs, the atmospheric scientist tasked by the National Science Foundation with studying the atmospheric effects of the Kuwaiti fires, stated that the fires' modest impact suggested that "some numbers (used to support the Nuclear Winter hypothesis)... were probably a little overblown." [7]

In a paper finalized in 2010 by the United States Department of Homeland Security, fire experts stated that due to the nature of modern city design and construction, with the US serving as an example, a firestorm is unlikely after a nuclear detonation in a modern city. This is not to say that fires won't occur over a large area after a detonation, but that the fires would not coalesce into the all-important stratosphere-punching firestorm plume that the nuclear winter papers require as a prerequisite assumption in their climate models. Additional recent studies of smoke columns indicate that nearly every possible fire scenario results in little to no stratospheric injection of smoke. [8]

The nuclear bombing of Nagasaki, for example, did not produce a firestorm. This was similarly noted as early as 1986-88, when the assumed quantity of fuel "mass loading" in cities underpinning the winter models was found to be too high, intentionally creating heat fluxes that loft smoke into the lower stratosphere. Assessments "more characteristic of conditions" found in real-world modern cities showed that the fuel loading, and hence the heat flux from burning, would rarely loft smoke much higher than 4 km. [9]

The scenarios contributing to a firestorm also depend on the size of the bombs used. Only bombs in the 1-megaton range and higher would ignite an area large enough for firestorms to coalesce, crossing over from sparsely located high fuel-load areas into lower fuel-load areas in a mixed city model such as Nashville. [10][11]

Russell Seitz, Associate of the Harvard University Center for International Affairs, argues that the winter model's assumptions give the results the researchers wanted to achieve and are a case of "worst-case analysis run amok," criticizing the theory for being built on successive worst-case events. [12]

Notes from "Disaster Preparedness, An International Perspective": "If the amount of smoke assumed in the 'nuclear winter' report (Science, v222, 1983, pp1283-92) were decreased by a factor of 2.5, the climatic effect would probably be trivial. In considering the actual terrain that surrounds most likely targets, the probable type of explosions (ground bursts against hardened military facilities), the overlapping of targets, and conditions that could reduce the incendiary potential of the thermal pulse, critics of the report believe that the quantity of smoke from non-urban fires has probably been overestimated by at least a factor of ten (Cresson Kearny, Fire Emissions and Some of Their Uncertainties, presented at the Fourth International Seminar on Nuclear War, Erice, Sicily, August 19-24, 1984). Rathjens and Siegel (Issues in Science and Technology, v1, 1985, pp123-8) believe there would likely be four times less smoke and eight times less soot from cities than estimated in the National Research Council study." [13]

Putting the fires of a nuclear war in another perspective: every year on earth, wildfires consume 350,000,000-450,000,000 hectares of forests, grasslands, and structures, and cause an average of 339,000 deaths worldwide. [14] This is equal to roughly 1,700,000 square miles burned every year worldwide, nearly half the size of the entire United States.
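A quick unit check on that comparison, converting the cited hectare figures to square miles:

```python
# Unit check: hectares of wildfire burn per year converted to square miles.
SQ_MI_PER_HECTARE = 1 / 258.999   # one square mile is about 259 hectares
low_ha, high_ha = 350_000_000, 450_000_000
print(f"{low_ha * SQ_MI_PER_HECTARE:,.0f} to {high_ha * SQ_MI_PER_HECTARE:,.0f} sq mi")
# ~1.35 to ~1.74 million sq mi per year, consistent with the ~1.7M figure cited,
# versus the 1.0 million sq mi covered in the all-arsenal scenario above.
```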
Earlier in this document, I laid out a hypothetical scenario in which every nuclear bomb in existence (excluding those listed as retired) is spread out at a density of one bomb every 100 square miles (10,000 bombs x 100 square miles = 1,000,000 square miles). Under that scenario, the bomb coverage extends over only about 1/3 of the land mass of the USA (the USA is 3,800,000 square miles). The world already burns more than that every year without sending the climate into a nuclear winter; those fires also release about half as much CO2 as the burning of fossil fuels does annually. [15]

Wildfires release massive amounts of energy, on a scale equivalent to nuclear weapons. The Chisholm Fire, a man-caused forest fire near Edmonton, Alberta, Canada in 2001, released the equivalent energy of 1,200 Hiroshima atomic detonations. [16] The firestorm after the bombing of Hiroshima released 200 times the energy of the atomic bomb itself.

Taking all that into consideration, and adjusting the implied atmospheric load of black carbon soot for the megatonnage available in today's arsenals, you might end up with 5 teragrams aloft in the lower stratosphere, resulting in a 2-3 °C drop for several months to, worst case, several years. Not quite a nuclear winter; barely a nuclear fall. Even that is debatable, since the evidence suggests a much shorter period of smoke suspension in the stratosphere, and the premise of uncontrolled firestorms is unfounded based upon actual observations of the bombs dropped in 1945: while Hiroshima did experience a firestorm, Nagasaki did not, and Nagasaki had much more combustible material than most modern-day cities. The great flaw of the original nuclear winter models is that they assumed the same high fuel loading for all cities and that firestorms would occur at all those locations. A firestorm isn't assured and is considered unlikely in modern cities, and thus the theory is flawed from top to bottom.

An interesting note about several major recent reports to the contrary of my conclusion, including ones going back 10 years: none of them question the fuel loading or the levels of atmospheric smoke generated. They all seem to use the original basis put forth by Carl Sagan's team, even though Sagan himself admitted his model did not work. The footnote here leads to an example of the poor-quality models still being pushed as real science: a Rutgers 2010 report that references Sagan's work and offers no explanation for the mechanism of smoke and soot transport into the stratosphere. Quality work is not guaranteed just because the sources are listed as professionals in this field. Healthy skepticism is your friend; use it. [17]

So nuclear winter was always a stretch, because the science was unfounded and we never had enough high-yield bombs to cause it; that is especially true in 2016, because no weapons in the required high-yield range remain in the active arsenals of the nuclear nations (other than a small quantity held by China, around 50, not enough to change these outcomes).

Final Thoughts

I have always been intrigued by the specter of nuclear weapons and the power of the atom, and I have a not-insignificant set of reference books collected over the years. The ones from the late 1940s and early 1950s are quite amusing; we did not know what we were really dealing with back then. We have come a long way since the era of Dr. Strangelove.
I have come full circle in my understanding and no longer buy into the popular myths, because the clear science tells you otherwise. However, having this knowledge may not be a blessing. Knowing that nuclear weapons are not the end of mankind in 2016 isn't necessarily a great truth to latch onto. The pain and suffering that would result from the use of these weapons should always remain a strong deterrent against their use.

Making the unthinkable thinkable: was there some sanity in the insanity of MAD? In dismissing the notions of nuclear winter and mutually assured destruction, could we be making the use of these weapons more palatable as a tool of political and ideological foreign policy enforcement? A quick fix to the next ISIS, where collateral damage is deemed acceptable? Is that something we can manage as a civilization? Are our values as a society strong enough to kill any future temptation to use these weapons as a quick fix for an immediate problem? Is a limited exchange something to be seriously considered? Or are we better off letting our imagination embrace a nightmare, a dark vision of reality, a nuclear winter, with complete conviction, without regard to the truth?

Note: I make no claim that I am right. I only offer an analysis that considers details and data overlooked by others, sometimes intentionally. Please do your own due diligence and make an educated determination for yourself.

Additional Notes and Recommended Follow-on Reading

Obama committed to a major nuclear triad upgrade in order to win Senate support for the New START Treaty in 2010. This article does a good job questioning why we plan to spend $1 trillion on nuclear weapons systems upgrades over the next couple of decades; it is a worthwhile read, and access is free with registration: https://www.foreignaffairs.com/articles/americas/2016-08-01/rethinking-nuclear-policy

Also: Allen E Hall's answer to "Are we in less danger or more danger today from a 1st strike?"; "A Nuclear Conflict with Russia is Likelier Than You Think"; and, with unplanned yet uncanny timing, on 60 Minutes tonight (9/25/2016): "Risk of nuclear attack rises."

A well-thought-out and compelling Harvard report, "The End of MAD," argues that America's technological edge and the reduced nuclear arsenals are actually pushing the USA toward a first strike:

"This article makes three empirical claims. First, the strategic nuclear balance has shifted dramatically since the end of the Cold War, and the United States now stands on the cusp of nuclear primacy. Second, the shift in the balance of power has two primary sources: the decline of the Russian nuclear arsenal and the steady growth in U.S. nuclear capabilities. Third, the trajectory of nuclear developments suggests that the nuclear balance will shift further in favor of the United States in the coming years. Russia and China will face tremendous incentives to reestablish mutual assured destruction, but doing so will require substantial sums of money and years of sustained effort. If these states want to reestablish a robust strategic deterrent, they will have to overcome current U.S. capabilities, planned improvements to the U.S. arsenal, and future developments being considered by the United States. U.S. nuclear primacy may last a decade or more." [18]

If this becomes a trend, the nuclear winter bit might need another take: Architects design 'world's tallest' wooden skyscraper
Footnotes

[1] The National Academies Press
[2] http://journals.ametsoc.org/doi/pdf/10.1175/2010BAMS3004.1
[3] http://www.tandfonline.com/doi/pdf/10.1080/02786828908959219
[4] Doomsday Scenarios
[5] http://www.atmos.washington.edu/~ackerman/Articles/Turco_Nuclear_Winter_83.pdf
[6] http://www.junkscience.com/wp-content/uploads/2016/04/Nuclear-winter_MetAtmPhys1988.pdf
[7] It Happened Here
[8] https://www.remm.nlm.gov/PlanningGuidanceNuclearDetonation.pdf
[9] http://personalpages.manchester.ac.uk/staff/S.Utyuzhnikov/Papers/AMM_SU.pdf
[10] http://www.dtic.mil/dtic/tr/fulltext/u2/a240444.pdf
[11] Nuclear Disasters & The Built Environment
[12] In from the cold: 'nuclear winter' melts down
[13] http://www.physiciansforcivildefense.org/PDF/5.pdf
[14] Wildfires kill 339,000 people per year: study
[15] Global Wildfires, Carbon Emissions and the Changing Climate - Future Directions International
[16] http://www.atmos-chem-phys.net/6/5247/2006/acp-6-5247-2006.pdf
[17] http://climate.envsci.rutgers.edu/pdf/WiresClimateChangeNW.pdf
[18] http://belfercenter.hks.harvard.edu/files/is3004_pp007-044_lieberpress.pdf

People Like Us

The best thing about this is that it is safe for users. They feel comfortable because of the format and professionalism.

Justin Miller