Pre And Post Waste Worksheets: Fill & Download for Free


How to Edit and Fill Out Pre And Post Waste Worksheets Online

Read the following instructions to use CocoDoc to start editing and filling out your Pre And Post Waste Worksheets:

  • To start with, look for the “Get Form” button and click on it.
  • Wait until Pre And Post Waste Worksheets is loaded.
  • Customize your document using the toolbar at the top.
  • Download your finished form and share it as needed.

How to Edit Your PDF Pre And Post Waste Worksheets Online

Editing your form online is quite effortless. You do not need to install any software on your computer or phone to use this feature. CocoDoc offers an easy tool for editing your document directly in any web browser, and the interface is well organized.

Follow the step-by-step guide below to edit your PDF files online:

  • Open the CocoDoc official website on the device where you have your file.
  • Find the ‘Edit PDF Online’ option and click on it.
  • When the tool page opens, drag and drop your file, or choose it via the ‘Choose File’ option.
  • Once the document is uploaded, edit it using the toolbar as needed.
  • When you have finished editing, press the ‘Download’ button to save the file.

How to Edit Pre And Post Waste Worksheets on Windows

Windows is the most widely used operating system, but it does not ship with a default application that can edit PDF files directly. In this case, you can install CocoDoc's desktop software for Windows, which helps you work on documents effectively.

All you have to do is follow the steps below:

  • Install the CocoDoc software from the Windows Store.
  • Open the software and import your PDF document.
  • You can also import the PDF file from OneDrive.
  • Edit the document as needed using the tools at the top.
  • Once done, save the finished form to your computer. You can also find more details about PDF editing on CocoDoc.

How to Edit Pre And Post Waste Worksheets on Mac

macOS comes with a default tool, Preview, for opening PDF files. Although Preview lets Mac users view PDF files and even mark up text, it does not support full editing. With CocoDoc, you can edit your document on a Mac easily.

Follow the effortless instructions below to start editing:

  • First, install the CocoDoc desktop app on your Mac.
  • Then, import your PDF file through the app.
  • You can upload the file from any cloud storage, such as Dropbox, Google Drive, or OneDrive.
  • Edit, fill, and sign your template using the CocoDoc editing tools.
  • Lastly, download the file to save it on your device.

How to Edit PDF Pre And Post Waste Worksheets via G Suite

G Suite is Google's suite of productivity apps, designed to make your work faster and improve collaboration. Integrating CocoDoc's PDF editor with G Suite helps you get work done handily.

Here are the steps to do it:

  • Open the Google Workspace Marketplace in your browser.
  • Look for CocoDoc PDF Editor and install the add-on.
  • Upload the file you want to edit, then open it with CocoDoc PDF Editor by choosing "Open with" in Drive.
  • Edit and sign your template using the toolbar.
  • Save the finished PDF file on your computer.

PDF Editor FAQ

There are only 3 months left until the 10th board exams and I have not studied anything. What should I do?

Hi,

First of all, take a deep breath and keep your eyes closed for five minutes. Just do it! Don't just read on.

Now, three months are left. Assuming that you don't take time off now, don't take unnecessary breaks, and don't waste time, follow a few more steps:

  • Uninstall your social media apps without telling your friends. Don't even try to access them from a browser.
  • Hand over your mobile to your parents. In case you need it for studying, fix a time for studying on the phone. Try to collect all your doubts and post them at once, instead of picking up your phone every five minutes to search for a solution and then getting lost in another world.
  • Organize your study space.
  • Know the syllabus and the proper weightage of each chapter.
  • NCERT is the bible; just gulp down each and every line. (Recommended only for social science.)
  • Try to finish chapters one by one, at most one a day. Even if you take 2–3 days for a single chapter, you will have sufficient time to revise it later.
  • Practice mathematics every day for 2–3 hours.
  • Go through your unit tests, assignments, worksheets, and pre-board answer sheets. Analyze them, see your mistakes, and rectify them. If possible, do a SWOT analysis.
  • Make sure you practice sample papers. Not necessarily one full paper at a time; you can go through them section-wise or topic-wise.

Online classes have surely impacted students, and negatively at that. It's high time we realize that instead of praying for exam delays and syllabus reductions, we should do our part. Now, some tips for online classes:

  • Don't sit passively; try to interact with the teacher.
  • Don't play games or scroll your Instagram feed.
  • Make up your mind that you won't look for answers on the net during the exam.
  • Take the classes seriously.

Sharing one of my posts on memory tips:

Memory Tip 1: Try teaching the concepts to your friends, or act as a teacher and teach your imaginary students. This way you will memorize the concepts and gain some confidence along the way.

Memory Tip 2: Take a nap and then learn the most difficult topic; the learning speed is at least 20x that of continuous learning. Don't try to memorize too many chapters in a single day, or you will end up forgetting all the concepts.

Memory Tip 3: This works best if you practice it regularly: revise the concept before going to bed, and again first thing in the morning.

Recommendations:

Apps: Class 10th CBSE by Mukesh Kaushik, and PhotoMath for understanding complex calculations.

Books: here is a link to my previous post, Guides and Books. You can also check out my space, Cracking 10th CBSE Board, for more guidance.

I know these are old-school methods and you need a strong will to follow them all. You may not be able to follow them religiously, but give your 100%. Your hard work will definitely pay off.

Always remember: it's never too late.

Signing off,
Aayushi

Is it true that Americans are very bad at mathematics? If so, why?

Speaking as an aspiring mathematician: on a more ideological level, mathematics is not taught the way a mathematician would teach it. Mathematics is presented in a dull and boring manner. The purpose of mathematics is not simply to solve problems, but to understand them; something our education system does not seem to value.

This isn't helped by a culture which assumes that it's okay to be bad at math. People say it almost pridefully! Imagine if people said with pride, “Oh yeah, I SUCK at reading!” Yet somehow it's okay to say this about mathematics.

Mathematics is truly about logic and creativity. The current public school system has taken it upon itself to remove both components from a math education. Most students never get to see the beauty of simple geometry, or the amazing symmetries that elementary group theory models so well.

Instead, students are left solving polynomials, with no clue why they are doing it and no clue why the method they have been taught works. They are not taught abstraction and reasoning skills. The concept of a proof does not exist. Yet since proofs are central to mathematics, what then are they learning? Equation solving. The mathematical education of the US is equation solving. There's not much else but a brief and disconnected stint in geometry, which never fails to disappoint the student, as it becomes the subject of proving obscure results whose significance is never understood.

As the mathematician Edward Frenkel says:

“What if at school you had to take an art class in which you were only taught how to paint a fence? What if you were never shown the paintings of Leonardo da Vinci and Picasso? Would that make you appreciate art? Would you want to learn more about it? I doubt it... Of course this sounds ridiculous, but this is how math is taught.”

How truly sad our mathematics education is! Our mathematical education “becomes the mental equivalent of watching paint dry.”

One mathematician, Paul Lockhart, has given a very detailed criticism of the mathematical education system in the US in a document titled “A Mathematician's Lament,” which can be found at https://www.maa.org/external_archive/devlin/LockhartsLament.pdf

Towards the end, he gives a paraphrase of US math education. I will include it here, as it is quite clear:

“LOWER SCHOOL MATH. The indoctrination begins. Students learn that mathematics is not something you do, but something that is done to you. Emphasis is placed on sitting still, filling out worksheets, and following directions. Children are expected to master a complex set of algorithms for manipulating Hindu symbols, unrelated to any real desire or curiosity on their part, and regarded only a few centuries ago as too difficult for the average adult. Multiplication tables are stressed, as are parents, teachers, and the kids themselves.

MIDDLE SCHOOL MATH. Students are taught to view mathematics as a set of procedures, akin to religious rites, which are eternal and set in stone. The holy tablets, or “Math Books,” are handed out, and the students learn to address the church elders as “they” (as in “What do they want here? Do they want me to divide?”). Contrived and artificial “word problems” will be introduced in order to make the mindless drudgery of arithmetic seem enjoyable by comparison.
Students will be tested on a wide array of unnecessary technical terms, such as ‘whole number’ and ‘proper fraction,’ without the slightest rationale for making such distinctions. Excellent preparation for Algebra I.

ALGEBRA I. So as not to waste valuable time thinking about numbers and their patterns, this course instead focuses on symbols and rules for their manipulation. The smooth narrative thread that leads from ancient Mesopotamian tablet problems to the high art of the Renaissance algebraists is discarded in favor of a disturbingly fractured, post-modern retelling with no characters, plot, or theme. The insistence that all numbers and expressions be put into various standard forms will provide additional confusion as to the meaning of identity and equality. Students must also memorize the quadratic formula for some reason.

GEOMETRY. Isolated from the rest of the curriculum, this course will raise the hopes of students who wish to engage in meaningful mathematical activity, and then dash them. Clumsy and distracting notation will be introduced, and no pains will be spared to make the simple seem complicated. The goal of this course is to eradicate any last remaining vestiges of natural mathematical intuition, in preparation for Algebra II.

ALGEBRA II. The subject of this course is the unmotivated and inappropriate use of coordinate geometry. Conic sections are introduced in a coordinate framework so as to avoid the aesthetic simplicity of cones and their sections. Students will learn to rewrite quadratic forms in a variety of standard formats for no reason whatsoever. Exponential and logarithmic functions are also introduced in Algebra II, despite not being algebraic objects, simply because they have to be stuck in somewhere, apparently. The name of the course is chosen to reinforce the ladder mythology. Why Geometry occurs in between Algebra I and its sequel remains a mystery.

TRIGONOMETRY. Two weeks of content are stretched to semester length by masturbatory definitional runarounds. Truly interesting and beautiful phenomena, such as the way the sides of a triangle depend on its angles, will be given the same emphasis as irrelevant abbreviations and obsolete notational conventions, in order to prevent students from forming any clear idea as to what the subject is about. Students will learn such mnemonic devices as “SohCahToa” and “All Students Take Calculus” in lieu of developing a natural intuitive feeling for orientation and symmetry. The measurement of triangles will be discussed without mention of the transcendental nature of the trigonometric functions, or the consequent linguistic and philosophical problems inherent in making such measurements. Calculator required, so as to further blur these issues.

PRE-CALCULUS. A senseless bouillabaisse of disconnected topics. Mostly a half-baked attempt to introduce late nineteenth-century analytic methods into settings where they are neither necessary nor helpful. Technical definitions of ‘limits’ and ‘continuity’ are presented in order to obscure the intuitively clear notion of smooth change. As the name suggests, this course prepares the student for Calculus, where the final phase in the systematic obfuscation of any natural ideas related to shape and motion will be completed.

CALCULUS. This course will explore the mathematics of motion, and the best ways to bury it under a mountain of unnecessary formalism.
Despite being an introduction to both the differential and integral calculus, the simple and profound ideas of Newton and Leibniz will be discarded in favor of the more sophisticated function-based approach developed as a response to various analytic crises which do not really apply in this setting, and which will of course not be mentioned. To be taken again in college, verbatim.

***

And there you have it. A complete prescription for permanently disabling young minds— a proven cure for curiosity. What have they done to mathematics!”

… Sigh…

***EDIT***

To address some of the remarks with regards to the importance of calculus and the unimportance of more mathematicians, Lockhart does address this in his paper:

“How many students taking literature classes will one day be writers? That is not why we teach literature, nor why students take it. We teach to enlighten everyone, not to train only the future professionals. In any case, the most valuable skill for a scientist or engineer is being able to think creatively and independently. The last thing anyone needs is to be trained.”

—

“But don’t we need people to learn those useful consequences of math? Don’t we need accountants and carpenters and such?

How many people actually use any of this “practical math” they supposedly learn in school? Do you think carpenters are out there using trigonometry? How many adults remember how to divide fractions, or solve a quadratic equation? Obviously the current practical training program isn’t working, and for good reason: it is excruciatingly boring, and nobody ever uses it anyway. So why do people think it’s so important? I don’t see how it’s doing society any good to have its members walking around with vague memories of algebraic formulas and geometric diagrams, and clear memories of hating them. It might do some good, though, to show them something beautiful and give them an opportunity to enjoy being creative, flexible, open-minded thinkers— the kind of thing a real mathematical education might provide.”

—

“But don’t you think that if math class were made more like art class that a lot of kids just wouldn’t learn anything?

They’re not learning anything now! Better to not have math classes at all than to do what is currently being done. At least some people might have a chance to discover something beautiful on their own.”

Why is Python so popular despite being so slow?

Yes, it can be up to 200x slower than the C family. It's not just that it's interpreted, since Lua can also be interpreted but is much faster. Mike Pall, the Lua genius, has said that it's because of certain decisions about language design, and at least one of the PyPy guys agreed with him: Have tracing JIT compilers won? (read the whole thing).

By the way, I should note that although web apps aren't my thing, latency does matter there too. Although even slow sites are relatively fast these days, I notice that we haven't yet hit the point where faster no longer matters. For some older comments, from days when the numbers were bigger, see here: Marissa Mayer at Web 2.0

Google VP Marissa Mayer just spoke at the Web 2.0 Conference and offered tidbits on what Google has learned about speed, the user experience, and user satisfaction.

Marissa started with a story about a user test they did. They asked a group of Google searchers how many search results they wanted to see. Users asked for more, more than the ten results Google normally shows. More is more, they said.

So, Marissa ran an experiment where Google increased the number of search results to thirty. Traffic and revenue from Google searchers in the experimental group dropped by 20%.

Ouch. Why? Why, when users had asked for this, did they seem to hate it?

After a bit of looking, Marissa explained that they found an uncontrolled variable. The page with 10 results took 0.4 seconds to generate. The page with 30 results took 0.9 seconds.

Half a second delay caused a 20% drop in traffic. Half a second delay killed user satisfaction.

This conclusion may be surprising -- people notice a half second delay? -- but we had a similar experience at Amazon. In A/B tests, we tried delaying the page in increments of 100 milliseconds and found that even very small delays would result in substantial and costly drops in revenue.

Being fast really matters. As Marissa said in her talk, "Users really respond to speed."

A major problem for the future is that datasets keep getting bigger, at a rate much faster than memory bandwidth and latency improve. I was speaking to a hedge fund tech guy at the D language meetup last night about this. His datasets are maybe 10x bigger than 10 years ago, and memory is maybe only 2x as fast. These relative trends show no sign of slowing, so Moore's Law isn't going to bail you out here. He found that at log data sizes of 30 gig, Python chokes. He also said that you can only prise numpy from his cold dead hands, as it's very useful for quick prototyping. But, contra Guido, no, Python isn't fast enough for many serious people, and this problem will only get worse. Python has horrible cache locality, and when the CPU has to wait for a memory access because the data is not in the cache, you may have to wait 500 cycles (see the short sketch at the end of this passage): Locality of reference

That's maybe something rather valuable: you want productivity and abstraction, but not to have to pay for it. Andrei Alexandrescu may have one answer:

http://bitbashing.io/2015/01/26/d-is-like-native-python.html
The Case for D
Programming in D for Python Programmers

In the past few decades, pursuing instant gratification has paid handsomely in many areas. In a world where the old rules no longer applied and things were changing quickly, you were much better off trying something and correcting course when it didn't work rather than being too thoughtful about it from the beginning. That seems to be beginning to change, and increased complexity is a big part of that.
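To make the cache-locality point above concrete, here is a minimal, illustrative Python sketch. Using numpy is my assumption here (any contiguous buffer would make the same point), and the exact timings vary by machine:

import time
import numpy as np

N = 5_000_000
boxed = [float(i) for i in range(N)]      # a list of pointers to heap-allocated float objects
packed = np.arange(N, dtype=np.float64)   # one contiguous buffer of raw doubles

t0 = time.perf_counter()
total_boxed = sum(boxed)       # chases a pointer per element: poor locality
t1 = time.perf_counter()
total_packed = packed.sum()    # streams sequentially through contiguous memory
t2 = time.perf_counter()

print(f"boxed list: {t1 - t0:.3f}s  contiguous array: {t2 - t1:.3f}s")

On most machines the contiguous version wins by an order of magnitude or more; that gap is the locality (and unboxing) effect described above rather than interpreter overhead, since both loops run inside the C runtime.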
Rewriting performance-sensitive bits of your code in C sounds like you get the best of both worlds. For some applications that may be true. In other cases you may find that you have walked into a trap: it is so gratifying to have your prototype working quickly, but before you know it the project is bigger than you imagined, and at that stage it's not so easy to rewrite bits (and as you do, you now have to manage two code bases in different languages and the interface between them, and keep them in sync).

Cython with memory views also seems like a great option, until you realize that you can't touch Python objects there. If you are writing a library (whether for your own use or for others), there is some possibility that you might want to use your code without engaging the GIL (global interpreter lock), i.e. in multi-threaded mode. In that situation you may end up depending on external C libraries for some purposes, since Python objects are off-limits. And that's fine, but it is yet more complexity and more dependencies.

On the other hand, here is how you can call Lua from D (and vice versa is equally simple). So you get the benefits of native code with productivity and low-cost high-level abstraction, but can still use a JITed scripting language if it suits your use case.

JakobOvrum/LuaD

import luad.all;

void main() {
    auto lua = new LuaState;
    lua.openLibs();
    auto print = lua.get!LuaFunction("print");
    print("hello, world!");
}

Here's how you write an Excel function in D that can be called directly as a worksheet function (I wrote the library with my colleagues helping):

D Programming Language Discussion Forum

import xlld;

@Register(ArgumentText("Array to add"),
          HelpTopic("Adds all cells in an array"),
          FunctionHelp("Adds all cells in an array"),
          ArgumentHelp(["The array to add"]))
double FuncAddEverything(double[][] args) nothrow @nogc {
    import std.algorithm: fold;
    import std.math: isNaN;

    double ret = 0;
    foreach(row; args)
        ret += row.fold!((a, b) => b.isNaN ? 0.0 : a + b)(0.0);
    return ret;
}

My point is that it's a false dichotomy. It's not fast-and-painfully-unproductive vs slow-and-productive. You can have both, if you have a bit of imagination and are prepared and able to make decisions based on the relevant factors rather than social proof.

What Knuth actually said is a little more nuanced than the soundbite his words have become (often used in conversation to terminate thought on a topic, when a little time pondering one's particular use case would pay dividends). He was saying don't waste time worrying about little low-level hacks to save a few percent unless you know it's important; he wasn't talking about big choices like which language (and implementation) you use.

There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail.

Page on archive.org
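In the spirit of "identify the critical code first", here is a minimal profiling sketch in Python. The function names are made up for illustration, but cProfile and pstats are standard library:

import cProfile
import pstats

def parse(line):
    # hypothetical hot spot: per-line string splitting and float conversion
    return [float(x) for x in line.split(",")]

def load(n=100_000):
    # hypothetical workload that calls the hot spot many times
    return [parse("1.0,2.0,3.0") for _ in range(n)]

cProfile.run("load()", "stats.prof")   # measure, don't guess
pstats.Stats("stats.prof").sort_stats("cumulative").print_stats(5)

Only once the profile has pointed at the genuinely critical code does it make sense to reach for C, Cython, D, or a rewrite; that is the part of Knuth's advice that survives the soundbite.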
Considering performance as one factor when you pick which framework you will be wedded to isn't premature optimization. It's prudent forethought about the implications, because it's easier to take the time to make the right decision today than to change it later.

By the way, he also said in the same article (we tend to hear what we want to hear and ignore the rest):

In our previous discussion we concluded that premature emphasis on efficiency is a big mistake which may well be the source of most programming complexity and grief. We should ordinarily keep efficiency considerations in the background when we formulate our programs. We need to be subconsciously aware of the data processing tools available to us, but we should strive most of all for a program that is easy to understand and almost sure to work. (Most programs are probably only run once; and I suppose in such cases we needn't be too fussy about even the structure, much less the efficiency, as long as we are happy with the answers.)

Python that isn't too clever may be easier to understand than old-school C/C++, but I am not sure that this is always the case when heavy metaprogramming gets involved (and nobody forces you to write old-school C/C++ today). Static typing does bring benefits in terms of correctness and readability, and some very smart people have spoken about this in explaining their choices of less popular frameworks: Caml Trading talk at CMU

There simply isn't an answer that applies to everyone; it depends on your circumstances and what you are trying to achieve. But I would encourage you, in the medium term, to consider the possibility that Python isn't the only high-level productive language that can be used to solve general-purpose sorts of problems. Some of these other languages don't carry this kind of performance penalty, are better on the correctness front, and can interface to any libraries you might wish to use.

I've alluded to one already, but there are others. Lua may be too simplistic for some, but it is fast. Facebook uses it in its Torch machine learning library, and you can run this from an IPython notebook. It's a big world out there: popular solutions get chosen for a reason, but when things change, the currently popular solution isn't always the best option for the future.

Addendum: a self-proclaimed 'Python fanboy' complained that I did not answer the question in this response. I think I did, although it's true that I would score nul points on the modern A-level style box-ticking approach to scoring exams. Whether that is a negative thing depends on your perspective!

Those who are watching closely will also notice that the question I answered is different from the one into which it has been merged. So if you object to that part, take it up with the guy that merged it.

Obviously Python is popular because it's gratifying to get quick results (libraries help too), and until recently performance didn't matter much, since processor speed and even the modest advances in memory latency and bandwidth leapfrogged for a while our ability to make use of them. So why not 'waste' a few cycles to make the programmer's life easier, since you aren't going to be doing much else with them?

One can't stop there though, because the future may be different.
SanDisk will have a 16 TB 2.5" SSD next year. It will cost a few thousand bucks, and isn't going to be in the next laptop I buy, for sure. But you can see which way the wind is blowing, because when people have large amounts of storage available they will find a way to use it, and memory simply shows no sign of keeping up. They are talking about doubling capacity every year or two. So in 10-odd years that's 7 doublings, which is 128x present capacity; yet memory might be only 2x faster. It looks like I'll be able to get gigabit internet in my area soon enough (whether I'll move house to take advantage of it is yet to be decided). It's a matter of time before that's commonplace, I should think.

On top of that, modern SSDs are pretty fast. You can get 2.1 GB/sec sequential read throughput using an M.2 half-terabyte drive that costs less than 300 quid. (That's raw data; possibly even higher effective throughput if the data is compressed and you can handle the decompression fast enough.) Yet it seems like the fastest JSON parser in the world takes 0.3 seconds to parse maybe 200 MB of data (so about 600 MB/sec), and parsing JSON isn't exactly the most expensive text-processing operation one might want to do. So it doesn't seem like one is necessarily limited by I/O in this case. And that's today, and trends are only going one way.

What is the best language to use in those circumstances? How long do you expect your software to last?

Addendum, 29th October 2016.

A paper published in January 2016 by the ACM observes the following. It may not be true for everyone, and may not be true for many for a while yet. But my experience has been that as storage gets bigger, faster, and cheaper, people find a way to use it and the size of useful datasets increases, and I think it truly is a case of William Gibson's "The future is already here - just unevenly distributed".

Non-volatile Storage

For the entire careers of most practicing computer scientists, a fundamental observation has consistently held true: CPUs are significantly more performant and more expensive than I/O devices. The fact that CPUs can process data at extremely high rates, while simultaneously servicing multiple I/O devices, has had a sweeping impact on the design of both hardware and software for systems of all sizes, for pretty much as long as we've been building them.

This assumption, however, is in the process of being completely invalidated.

The arrival of high-speed, non-volatile storage devices, typically referred to as Storage Class Memories (SCM), is likely the most significant architectural change that datacenter and software designers will face in the foreseeable future. SCMs are increasingly part of server systems, and they constitute a massive change: the cost of an SCM, at $3-5k, easily exceeds that of a many-core CPU ($1-2k), and the performance of an SCM (hundreds of thousands of I/O operations per second) is such that one or more entire many-core CPUs are required to saturate it.

This change has profound effects:

1. The age-old assumption that I/O is slow and computation is fast is no longer true: this invalidates decades of design decisions that are deeply embedded in today's systems.

2. The relative performance of layers in systems has changed by a factor of a thousand times over a very short time: this requires rapid adaptation throughout the systems software stack.
3. Piles of existing enterprise datacenter infrastructure—hardware and software—are about to become useless (or, at least, very inefficient): SCMs require rethinking the compute/storage balance and architecture from the ground up.

Addendum, March 2017

Intel 3D XPoint drives are now available, although they aren't cheap. Their I/O performance means it's increasingly difficult to say that you're necessarily I/O bound. Emerging storage today has 1,000 times better latency than NAND flash (SSD drives), and only 10 times worse latency than DRAM. Overnight, that means the bottleneck has moved away from storage to the processors, the bus, the kernel and so on: the entire architecture, including applications and server processes. Guido's claim that Python is fast enough may still be true for many applications, but not if you are handling decent amounts of data.

These new storage technologies won't change everything overnight. But they'll get cheaper and more widespread quickly enough, and that will have implications for making the right decisions about language choices in the future. Because it's empirically true that what's possible in a language implementation depends an awful lot on language design: they are intimately coupled. If you want to make the most of emerging storage technologies, it's unlikely in my view that Python will in general be the right tool for the job, even if it was a decade back.

Some people here say things that appear to make sense but are simply not right. Python is slow not because it is interpreted, or because the global interpreter lock (GIL) gets in the way of Python threads; those things only make it worse. Python is slow because language features that are there by design make it incredibly difficult to make it fast. You can make a restricted subset fast; there's no controversy about that. But what I say is also what Mike Pall, the LuaJIT genius, has said, and the authors of PyPy agreed with him.

Here is what the author of Pyston, the Dropbox attempt to JIT Python (they gave up because it was just too difficult), has to say about why Python is slow:

Why is Python slow

“There's been some discussion over on Hacker News, and the discussion turned to a commonly mentioned question: if LuaJIT can have a fast interpreter, why can't we use their ideas and make Python fast? This is related to a number of other questions, such as "why can't Python be as fast as JavaScript or Lua", or "why don't you just run Python on a preexisting VM such as the JVM or the CLR". Since these questions are pretty common I thought I'd try to write a blog post about it.

The fundamental issue is: Python spends almost all of its time in the C runtime.

This means that it doesn't really matter how quickly you execute the "Python" part of Python. Another way of saying this is that Python opcodes are very complex, and the cost of executing them dwarfs the cost of dispatching them. Another analogy I give is that executing Python is more similar to rendering HTML than it is to executing JS -- it's more of a description of what the runtime should do rather than an explicit step-by-step account of how to do it.

Pyston's performance improvements come from speeding up the C code, not the Python code. When people say "why doesn't Pyston use [insert favorite JIT technique here]", my question is whether that technique would help speed up C code. I think this is the most fundamental misconception about Python performance: we spend our energy trying to JIT C code, not Python code.
This is also why I am not very interested in running Python on pre-existing VMs, since that will only exacerbate the problem in order to fix something that isn't really broken.

I think another thing to consider is that a lot of people have invested a lot of time into reducing Python interpretation overhead. If it really was as simple as "just porting LuaJIT to Python", we would have done that by now.

I gave a talk on this recently, and you can find the slides here and a LWN writeup here (no video, unfortunately). In the talk I gave some evidence for my argument that interpretation overhead is quite small, and some motivating examples of C-runtime slowness (such as a slow for loop that doesn't involve any Python bytecodes).

One of the questions from the audience was "are there actually any people that think that Python performance is about interpreter overhead?". They seem to not read HN :)

Update: why is the Python C runtime slow?

Here's the example I gave in my talk illustrating the slowness of the C runtime. This is a for loop written in Python, but that doesn't execute any Python bytecodes:

import itertools
sum(itertools.repeat(1.0, 100000000))

The amazing thing about this is that if you write the equivalent loop in native JS, V8 can run it 6x faster than CPython. In the talk I mistakenly attributed this to boxing overhead, but Raymond Hettinger kindly pointed out that CPython's sum() has an optimization to avoid boxing when the summands are all floats (or ints). So it's not boxing overhead, and it's not dispatching on tp_as_number->tp_add to figure out how to add the arguments together.

My current best explanation is that it's not so much that the C runtime is slow at any given thing it does, but it just has to do a lot. In this itertools example, about 50% of the time is dedicated to catching floating point exceptions. The other 50% is spent figuring out how to iterate the itertools.repeat object, and checking whether the return value is a float or not. All of these checks are fast and well optimized, but they are done every loop iteration so they add up. A back-of-the-envelope calculation says that CPython takes about 30 CPU cycles per iteration of the loop, which is not very many, but is proportionally much more than V8's 5.

I thought I'd try to respond to a couple other points that were brought up on HN (always a risky proposition):

If JS/Lua can be fast why don't the Python folks get their act together and be fast?

Python is a much, much more dynamic language than even JS. Fully talking about that probably would take another blog post, but I would say that the increase in dynamicism from JS to Python is larger than the increase going from Java to JS. I don't know enough about Lua to compare, but it sounds closer to JS than to Java or Python.”
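If you want to reproduce the flavour of the quoted itertools example yourself, here is a minimal sketch; absolute numbers will differ by machine and Python version, and the point is the comparison, not the figures:

import itertools
import timeit

def c_runtime_loop():
    # the whole loop runs inside the C runtime; no Python bytecode per iteration
    return sum(itertools.repeat(1.0, 10_000_000))

def bytecode_loop():
    # the same loop dispatched through Python bytecode, one iteration at a time
    total = 0.0
    for x in itertools.repeat(1.0, 10_000_000):
        total += x
    return total

print("C runtime loop:", timeit.timeit(c_runtime_loop, number=3))
print("bytecode loop: ", timeit.timeit(bytecode_loop, number=3))

The bytecode version is slower, but typically only by a small multiple; that gap is the interpreter overhead, and the quoted post's point is that even the C-runtime version is far slower than the equivalent native loop under V8.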

View Our Customer Reviews

CocoDoc has made my workflow much more efficient! Moving documents from hard copies to digital made everything come together and saved a ton of paper!

Justin Miller