The Law Of Marriage Act, 1971 Part I (A - Yale Law School: Fill & Download for Free


How to Edit and Fill Out The Law Of Marriage Act, 1971 Part I (A - Yale Law School Online

Read the following instructions to use CocoDoc to start editing and finalizing your The Law Of Marriage Act, 1971 Part I (A - Yale Law School:

  • To start, find the “Get Form” button and click it.
  • Wait until The Law Of Marriage Act, 1971 Part I (A - Yale Law School is shown.
  • Customize your document using the toolbar at the top.
  • Download your finished form and share it as needed.

How to Edit Your PDF The Law Of Marriage Act, 1971 Part I (A - Yale Law School Online

Editing your form online is straightforward. You don't need to install any software on your computer or phone to use this feature. CocoDoc offers an easy tool for editing your document directly in any web browser. The entire interface is well organized.

Follow the step-by-step guide below to edit your PDF files online:

  • Go to the CocoDoc official website on the device where your file is stored.
  • Find the ‘Edit PDF Online’ button and click it.
  • The free editing tool will open. Drag and drop the form onto the page, or upload the file through the ‘Choose File’ option.
  • Once the document is uploaded, you can edit it as needed using the toolbar.
  • When you have finished editing, click the ‘Download’ icon to save the file.

How to Edit The Law Of Marriage Act, 1971 Part I (A - Yale Law School on Windows

Windows is the most widely used operating system. However, it does not include a default application that can edit PDF documents directly. In this case, you can get CocoDoc's desktop software for Windows, which helps you work on documents efficiently.

All you have to do is follow the steps below:

  • Install the CocoDoc software from the Windows Store.
  • Open the software and import your PDF document.
  • You can also import the PDF file from Dropbox.
  • After that, edit the document as needed using the various tools at the top.
  • Once done, save the finished document to your computer. You can also check more details about how to edit a PDF.

How to Edit The Law Of Marriage Act, 1971 Part I (A - Yale Law School on Mac

macOS comes with a default application, Preview, for opening PDF files. Although Preview lets Mac users view PDF files and even mark up text, it does not support editing. With the help of CocoDoc, you can edit your document on a Mac without hassle.

Follow the effortless steps below to start editing:

  • First, install the CocoDoc desktop app on your Mac.
  • Then, import your PDF file through the app.
  • You can upload the document from any cloud storage, such as Dropbox, Google Drive, or OneDrive.
  • Edit, fill, and sign your template using the built-in tools.
  • Lastly, download the document to save it on your device.

How to Edit PDF The Law Of Marriage Act, 1971 Part I (A - Yale Law School on G Suite

G Suite is Google's suite of intelligent apps, designed to make your workforce more productive and increase collaboration between you and your colleagues. Integrating CocoDoc's PDF editor with G Suite can help you get work done handily.

Here are the steps to do it:

  • Open the Google Workspace Marketplace on your computer.
  • Look for CocoDoc PDF Editor and install the add-on.
  • Upload the document that you want to edit and find CocoDoc PDF Editor by clicking "Open with" in Drive.
  • Edit and sign your template using the toolbar.
  • Save the finished PDF file on your device.

PDF Editor FAQ

What are some good pieces of advice that most college students are not ever likely to hear?

If I could, I'd give every college student a copy of the enduring essays "What Are You Going to Do With That?," by William Deresiewicz and "The Case for Breaking Up With Your Parents," by Terry Castle, stay with them until they read both essays completely, and tell them to re-read them on a yearly basis (I do), because I think that's almost all of what they need to hear. Scratch that, I'd pass copies to everyone and anyone:

What Are You Going to Do With That?

The question my title poses, of course, is the one that is classically aimed at humanities majors. What practical value could there possibly be in studying literature or art or philosophy? So you must be wondering why I'm bothering to raise it here, at Stanford, this renowned citadel of science and technology. What doubt can there be that the world will offer you many opportunities to use your degree?

But that's not the question I'm asking. By "do" I don't mean a job, and by "that" I don't mean your major. We are more than our jobs, and education is more than a major. Education is more than college, more even than the totality of your formal schooling, from kindergarten through graduate school. By "What are you going to do," I mean, what kind of life are you going to lead? And by "that," I mean everything in your training, formal and informal, that has brought you to be sitting here today, and everything you're going to be doing for the rest of the time that you're in school.

We should start by talking about how you did, in fact, get here. You got here by getting very good at a certain set of skills. Your parents pushed you to excel from the time you were very young. They sent you to good schools, where the encouragement of your teachers and the example of your peers helped push you even harder. Your natural aptitudes were nurtured so that, in addition to excelling in all your subjects, you developed a number of specific interests that you cultivated with particular vigor.
You did extracurricular activities, went to afterschool programs, took private lessons. You spent summers doing advanced courses at a local college or attending skill-specific camps and workshops. You worked hard, you paid attention, and you tried your very best. And so you got very good at math, or piano, or lacrosse, or, indeed, several things at once.

Now there's nothing wrong with mastering skills, with wanting to do your best and to be the best. What's wrong is what the system leaves out: which is to say, everything else. I don't mean that by choosing to excel in math, say, you are failing to develop your verbal abilities to their fullest extent, or that in addition to focusing on geology, you should also focus on political science, or that while you're learning the piano, you should also be working on the flute. It is the nature of specialization, after all, to be specialized. No, the problem with specialization is that it narrows your attention to the point where all you know about and all you want to know about, and, indeed, all you can know about, is your specialty.

The problem with specialization is that it makes you into a specialist. It cuts you off, not only from everything else in the world, but also from everything else in yourself. And of course, as college freshmen, your specialization is only just beginning. In the journey toward the success that you all hope to achieve, you have completed, by getting into Stanford, only the first of many legs. Three more years of college, three or four or five years of law school or medical school or a Ph.D. program, then residencies or postdocs or years as a junior associate. In short, an ever-narrowing funnel of specialization. You go from being a political-science major to being a lawyer to being a corporate attorney to being a corporate attorney focusing on taxation issues in the consumer-products industry.
You go from being a biochemistry major to being a doctor to being a cardiologist to being a cardiac surgeon who performs heart-valve replacements.

Again, there's nothing wrong with being those things. It's just that, as you get deeper and deeper into the funnel, into the tunnel, it becomes increasingly difficult to remember who you once were. You start to wonder what happened to that person who played piano and lacrosse and sat around with her friends having intense conversations about life and politics and all the things she was learning in her classes. The 19-year-old who could do so many things, and was interested in so many things, has become a 40-year-old who thinks about only one thing. That's why older people are so boring. "Hey, my dad's a smart guy, but all he talks about is money and livers."

And there's another problem. Maybe you never really wanted to be a cardiac surgeon in the first place. It just kind of happened. It's easy, the way the system works, to simply go with the flow. I don't mean the work is easy, but the choices are easy. Or rather, the choices sort of make themselves. You go to a place like Stanford because that's what smart kids do. You go to medical school because it's prestigious. You specialize in cardiology because it's lucrative. You do the things that reap the rewards, that make your parents proud, and your teachers pleased, and your friends impressed. From the time you started high school and maybe even junior high, your whole goal was to get into the best college you could, and so now you naturally think about your life in terms of "getting into" whatever's next. "Getting into" is validation; "getting into" is victory. Stanford, then Johns Hopkins medical school, then a residency at the University of San Francisco, and so forth. Or Michigan Law School, or Goldman Sachs, or McKinsey, or whatever. You take it one step at a time, and the next step always seems to be inevitable.

Or maybe you did always want to be a cardiac surgeon.
You dreamed about it from the time you were 10 years old, even though you had no idea what it really meant, and you stayed on course for the entire time you were in school. You refused to be enticed from your path by that great experience you had in AP history, or that trip you took to Costa Rica the summer after your junior year in college, or that terrific feeling you got taking care of kids when you did your rotation in pediatrics during your fourth year in medical school.

But either way, either because you went with the flow or because you set your course very early, you wake up one day, maybe 20 years later, and you wonder what happened: how you got there, what it all means. Not what it means in the "big picture," whatever that is, but what it means to you. Why you're doing it, what it's all for. It sounds like a cliché, this "waking up one day," but it's called having a midlife crisis, and it happens to people all the time.

There is an alternative, however, and it may be one that hasn't occurred to you. Let me try to explain it by telling you a story about one of your peers, and the alternative that hadn't occurred to her. A couple of years ago, I participated in a panel discussion at Harvard that dealt with some of these same matters, and afterward I was contacted by one of the students who had come to the event, a young woman who was writing her senior thesis about Harvard itself, how it instills in its students what she called self-efficacy, the sense that you can do anything you want. Self-efficacy, or, in more familiar terms, self-esteem. There are some kids, she said, who get an A on a test and say, "I got it because it was easy." And there are other kids, the kind with self-efficacy or self-esteem, who get an A on a test and say, "I got it because I'm smart."

Again, there's nothing wrong with thinking that you got an A because you're smart.
But what that Harvard student didn't realize—and it was really quite a shock to her when I suggested it—is that there is a third alternative. True self-esteem, I proposed, means not caring whether you get an A in the first place. True self-esteem means recognizing, despite everything that your upbringing has trained you to believe about yourself, that the grades you get—and the awards, and the test scores, and the trophies, and the acceptance letters—are not what defines who you are.

She also claimed, this young woman, that Harvard students take their sense of self-efficacy out into the world and become, as she put it, "innovative." But when I asked her what she meant by innovative, the only example she could come up with was "being CEO of a Fortune 500." That's not innovative, I told her, that's just successful, and successful according to a very narrow definition of success. True innovation means using your imagination, exercising the capacity to envision new possibilities.

But I'm not here to talk about technological innovation, I'm here to talk about a different kind. It's not about inventing a new machine or a new drug. It's about inventing your own life. Not following a path, but making your own path. The kind of imagination I'm talking about is moral imagination. "Moral" meaning not right or wrong, but having to do with making choices. Moral imagination means the capacity to envision new ways to live your life.

It means not just going with the flow. It means not just "getting into" whatever school or program comes next. It means figuring out what you want for yourself, not what your parents want, or your peers want, or your school wants, or your society wants. Originating your own values. Thinking your way toward your own definition of success. Not simply accepting the life that you've been handed. Not simply accepting the choices you've been handed.
When you walk into Starbucks, you're offered a choice among a latte and a macchiato and an espresso and a few other things, but you can also make another choice. You can turn around and walk out. When you walk into college, you are offered a choice among law and medicine and investment banking and consulting and a few other things, but again, you can also do something else, something that no one has thought of before.

Let me give you another counterexample. I wrote an essay a couple of years ago that touched on some of these same points. I said, among other things, that kids at places like Yale or Stanford tend to play it safe and go for the conventional rewards. And one of the most common criticisms I got went like this: What about Teach for America? Lots of kids from elite colleges go and do TFA after they graduate, so therefore I was wrong. TFA, TFA—I heard that over and over again. And Teach for America is undoubtedly a very good thing. But to cite TFA in response to my argument is precisely to miss the point, and to miss it in a way that actually confirms what I'm saying. The problem with TFA—or rather, the problem with the way that TFA has become incorporated into the system—is that it's just become another thing to get into.

In terms of its content, Teach for America is completely different from Goldman Sachs or McKinsey or Harvard Medical School or Berkeley Law, but in terms of its place within the structure of elite expectations, of elite choices, it is exactly the same. It's prestigious, it's hard to get into, it's something that you and your parents can brag about, it looks good on your résumé, and most important, it represents a clearly marked path. You don't have to make it up yourself, you don't have to do anything but apply and do the work—just like college or law school or McKinsey or whatever. It's the Stanford or Harvard of social engagement. It's another hurdle, another badge.
It requires aptitude and diligence, but it does not require a single ounce of moral imagination.

Moral imagination is hard, and it's hard in a completely different way than the hard things you're used to doing. And not only that, it's not enough. If you're going to invent your own life, if you're going to be truly autonomous, you also need courage: moral courage. The courage to act on your values in the face of what everyone's going to say and do to try to make you change your mind. Because they're not going to like it. Morally courageous individuals tend to make the people around them very uncomfortable. They don't fit in with everybody else's ideas about the way the world is supposed to work, and still worse, they make them feel insecure about the choices that they themselves have made—or failed to make. People don't mind being in prison as long as no one else is free. But stage a jailbreak, and everybody else freaks out.

In A Portrait of the Artist as a Young Man, James Joyce has Stephen Dedalus famously say, about growing up in Ireland in the late 19th century, "When the soul of a man is born in this country there are nets flung at it to hold it back from flight. You talk to me of nationality, language, religion. I shall try to fly by those nets."

Today there are other nets. One of those nets is a term that I've heard again and again as I've talked with students about these things. That term is "self-indulgent." "Isn't it self-indulgent to try to live the life of the mind when there are so many other things I could be doing with my degree?" "Wouldn't it be self-indulgent to pursue painting after I graduate instead of getting a real job?"

These are the kinds of questions that young people find themselves being asked today if they even think about doing something a little bit different. Even worse, the kinds of questions they are made to feel compelled to ask themselves.
Many students have spoken to me, as they navigated their senior years, about the pressure they felt from their peers—from their peers—to justify a creative or intellectual life. You're made to feel like you're crazy: crazy to forsake the sure thing, crazy to think it could work, crazy to imagine that you even have a right to try.

Think of what we've come to. It is one of the great testaments to the intellectual—and moral, and spiritual—poverty of American society that it makes its most intelligent young people feel like they're being self-indulgent if they pursue their curiosity. You are all told that you're supposed to go to college, but you're also told that you're being "self-indulgent" if you actually want to get an education. Or even worse, give yourself one. As opposed to what? Going into consulting isn't self-indulgent? Going into finance isn't self-indulgent? Going into law, like most of the people who do, in order to make yourself rich, isn't self-indulgent? It's not OK to play music, or write essays, because what good does that really do anyone, but it is OK to work for a hedge fund. It's selfish to pursue your passion, unless it's also going to make you a lot of money, in which case it's not selfish at all.

Do you see how absurd this is? But these are the nets that are flung at you, and this is what I mean by the need for courage. And it's a never-ending process. At that Harvard event two years ago, one person said, about my assertion that college students needed to keep rethinking the decisions they've made about their lives, "We already made our decisions, back in middle school, when we decided to be the kind of high achievers who get into Harvard." And I thought, who wants to live with the decisions that they made when they were 12? Let me put that another way. Who wants to let a 12-year-old decide what they're going to do for the rest of their lives?
Or a 19-year-old, for that matter?

All you can decide is what you think now, and you need to be prepared to keep making revisions. Because let me be clear. I'm not trying to persuade you all to become writers or musicians. Being a doctor or a lawyer, a scientist or an engineer or an economist—these are all valid and admirable choices. All I'm saying is that you need to think about it, and think about it hard. All I'm asking is that you make your choices for the right reasons. All I'm urging is that you recognize and embrace your moral freedom.

And most of all, don't play it safe. Resist the seductions of the cowardly values our society has come to prize so highly: comfort, convenience, security, predictability, control. These, too, are nets. Above all, resist the fear of failure. Yes, you will make mistakes. But they will be your mistakes, not someone else's. And you will survive them, and you will know yourself better for having made them, and you will be a fuller and a stronger person.

It's been said—and I'm not sure I agree with this, but it's an idea that's worth taking seriously—that you guys belong to a "postemotional" generation. That you prefer to avoid messy and turbulent and powerful feelings. But I say, don't shy away from the challenging parts of yourself. Don't deny the desires and curiosities, the doubts and dissatisfactions, the joy and the darkness, that might knock you off the path that you have set for yourself. College is just beginning for you, adulthood is just beginning. Open yourself to the possibilities they represent. The world is much larger than you can imagine right now. Which means, you are much larger than you can imagine.

The Case for Breaking Up With Your Parents

Shall I be ashamed to kill mother? —Aeschylus, The Libation Bearers

Time: last year. Place: an undergraduate classroom, in the airy, well-wired precincts of Silicon Valley University. (Oops, I mean Sun-Kissed-Google-Apps-University.)
I am avoiding the pedagogical business at hand—the class is my annual survey of 18th-century British literature, and it's as rockin' and rollin' as you might imagine, given the subject—in order to probe my students' reactions to a startling and (to me) disturbing article I have just read in the Harvard alumni magazine. The piece, by Craig Lambert, one of the magazine's editors, is entitled "Nonstop: Today's Superhero Undergraduates Do '3000 Things at 150 Percent.'"

As the breaking-newsfeed title suggests, the piece, on the face of it, is anecdotal and seemingly light-hearted—a collegiate Ripley's Believe It or Not! about the overscheduled lives of today's Harvard undergraduates. More than ever before, it would appear, these poised, high-achieving, fantastically disciplined students routinely juggle intense academic studies with what can only seem (at least to an older generation) a truly dizzy-making array of extracurricular activities: pre-professional internships, world-class athletics, social and political advocacy, start-up companies, volunteering for nonprofits, research assistantships, peer advising, musical and dramatic performances, podcasts and video-making, and countless other no doubt virtuous (and résumé-building) pursuits. The pace is so relentless, students say, some plan their packed daily schedules down to the minute—i.e., "shower: 7:15-7:20 a.m."; others confess to getting by on two or three hours of sleep a night. Over the past decade, it seems, the average Harvard undergraduate has morphed into a sort of lean, glossy, turbocharged superhamster: Look in the cage and all you see, where the treadmill should be, is a beautiful blur.

I am curious if my Stanford students' lives are likewise chockablock. Heads nod yes; deep sighs are expelled; their own lives are similarly crazy. They can barely keep up, they say—particularly given all the texting and tweeting and cellphoning they have to do from hour to hour too. Do they mind? Not hugely, it would seem.
True, they are mildly intrigued by Lambert's suggestion that the "explosion of busyness" is a relatively recent historical phenomenon—and that, over the past 10 or 15 years, uncertain economic conditions, plus a new cultural emphasis on marketing oneself to employers, have led to ever more extracurricular add-ons. Yes, they allow: You do have to display your "well-roundedness" once you graduate. Thus the supersize CV's. You'll need, after all, to advertise a catalog of competencies: your diverse interests, original turn of mind, ability to work alone or in a team, time-management skills, enthusiasm, unflappability—not to mention your moral probity, generosity to those less fortunate, lovable "meet cute" quirkiness, and pleasure in the simple things of life, such as synchronized swimming, competitive dental flossing, and Antarctic exploration. "Yes, it can often be frenetic and with an eye toward résumés," one Harvard assistant dean of students observes, "but learning outside the classroom through extracurricular opportunities is a vital part of the undergraduate experience here."

Yet such references to the past—truly a foreign country to my students—ultimately leave them unimpressed. They laugh when I tell them that during my own somewhat damp Jurassic-era undergraduate years—spent at a tiny, obscure, formerly Methodist school in the rainy Pacific Northwest between 1971 and 1975—I never engaged in a single activity that might be described as "extracurricular" in the contemporary sense, not, that is, unless you count the little work-study job I had toiling away evenings in the sleepy campus library. What was I doing all day? Studying and going to class, to be sure. Reading books, listening to music, falling in love (or at least imagining it). Eating ramen noodles with peanut butter. But also, I confess, I did a lot of plain old sitting around—if not outright malingering. I've got a box of musty journals to prove it. After all, nobody even exercised in those days.
Nor did polyester exist. Once you'd escaped high school and obligatory PE classes—goodbye hirsute Miss Davis; goodbye, ugly cotton middy blouse and gym shorts—you were done with that. We were all so countercultural back then—especially in the Pacific Northwest, where the early 1970s were still the late sixties. The 1860s.

The students now regard me with curiosity and vague apprehension. What planet is she from.

But I have another question for them. While Lambert, author of "Nonstop," admires the multitasking undergraduates Harvard attracts, he also worries about the intellectual and emotional costs of such all-consuming busyness. In a turn toward gravitas, he quotes the French film director Jean Renoir's observation that "the foundation of all civilization is loitering" and wonders aloud if "unstructured chunks of time" aren't necessary for creative thinking. And while careful to phrase his concerns ever so delicately—this is the Harvard alumni magazine, after all—he seems afraid that one reason today's students are so driven and compulsive is that they have been trained up to it since babyhood: From preschool on, they are accustomed to their parents pushing them ferociously to make use of every spare minute. Contemporary middle-class parents—often themselves highly accomplished professionals—"groom their children for high achievement," he suspects, "in ways that set in motion the culture of scheduled lives and nonstop activity." He quotes a former Harvard dean of student life:

This is the play-date generation. ... There was a time when children came home from school and just played randomly with their friends. Or hung around and got bored, and eventually that would lead you on to something. Kids don't get to do that now. Busy parents book them into things constantly—violin lessons, ballet lessons, swimming teams.
The kids get the idea that someone will always be structuring their time for them.

The current dean of freshmen concurs: "Starting at an earlier age, students feel that their free time should be taken up with purposeful activities. There is less stumbling on things you love ... and more being steered toward pursuits." Some of my students begin to look downright uneasy; some are now listening hard.

Such parental involvement can be distasteful, even queasy-making. "Now," writes Lambert, parents "routinely 'help' with assignments, making teachers wonder whose work they are really grading. ... Once, college applicants typically wrote their own applications, including the essays; today, an army of high-paid consultants, coaches, and editors is available to orchestrate and massage the admissions effort." Nor do such parents give up their busybody ways, apparently, once their offspring lands a prized berth at some desired institute of higher learning. Lambert elaborates:

Parental engagement even in the lives of college-age children has expanded in ways that would have seemed bizarre in the recent past. (Some colleges have actually created a "dean of parents" position—whether identified as such or not—to deal with them.) The "helicopter parents" who hover over nearly every choice or action of their offspring have given way to "snowplow parents" who determinedly clear a path for their child and shove aside any obstacle they perceive in the way.

•

Now, as a professor I have had some experiences with "helicopter" parents, and were weather patterns on the West Coast slightly more rigorous, I'm sure I would have encountered "snowplow" parents as well. Indelibly etched on my brain, I tell the class, is a phone call I received one winter break from the aggrieved mother of a student to whom I had given a C-minus in a course that fall. The class had been a graduate course, a Ph.D. seminar, no less. The woman's daughter, a first-year Ph.D.
student, had spoken nary a word in class, nor had she ever visited during office hours. Her seminar paper had been unimpressive: Indeed it was one of those for which the epithet "gobsmackingly incoherent" might seem to have been invented. Still, the mother lamented, her daughter was distraught; the poor child had done nothing over the break but cry and brood and wander by herself in the woods. I had ruined everybody's Christmas, apparently, so would I not redeem myself by allowing her daughter to rewrite her seminar paper for a higher grade? It was only fair.

While startled to get such a call, I confess to being cowed by this direct maternal assault and, against my academic better judgment, said OK. The student did rewrite the essay, and this time I gave it a B. Generous, I thought. (It was better but still largely incomprehensible.) Yet the ink was hardly dry when the mother called again: Why wasn't her cherished daughter receiving an A? She had rewritten the paper! Surely I realized ... etc. One was forced to feign the gruesome sounds of a fatal choking fit just to get off the phone.

Did such hands-on parental advocacy—I inquired—trouble my students? My caller obviously represented an extreme instance, but what did they think about the wider phenomenon? Having internalized images of themselves (if only unconsciously) as standard-bearers of parental ambition—or so Lambert's article had it—their peers at Harvard didn't seem particularly shocked or embarrassed by Ma and Pa's lobbying efforts on their behalf. According to one survey, only 5 to 6 percent of undergrads felt their parents had been "too involved" in the admission process. Once matriculated (there's an interesting word), most students saw frequent parental contact and advice-giving as normal: A third of Harvard undergraduates reported calling or messaging daily with a parent.

Yet here it was—just at this delicate punctum—that I found myself reduced (however briefly) to speechlessness. Blindsided.
So how often do my students—mostly senior English majors, living in residential dorms—text or talk to their parents? Broad smiles all around. Embarrassed looks at one another. Whispers and some excited giggling. A lot. Well, how much exactly? A lot. But what's a lot? They can't believe I'm asking. Why do I want to know? I might as well be asking them how often they masturbate. And then it all comes tumbling out:

Oh, like, every day, sometimes more than once.

At least two or three times a day. (Group laughter.)

My father e-mails me jokes and stuff every day.

My mother would worry if I didn't call her every day. (Nodding heads.)

Well, we're always in touch—my parents live nearby so I go home weekends, too.

Finally, one student—a delightful young woman whom I know to be smart and levelheaded—confesses that she talks to her mother on the cellphone at least five, maybe six, even seven times a day: We're like best friends, so I call her whenever I get out of class. She wants to know about my professors, what was the exam, so I tell her what's going on and give her, you know, updates. Sometimes my grandmother's there, and I talk to her too.

I'm stunned; I'm aghast; I'm going gaga. I must look fairly stricken too—Elektra keening over the corpse of Agamemnon—because now the whole class starts laughing at me, their strange unfathomable lady-professor, the one who doesn't own a television and obviously doesn't have any kids of her own. What a freak. "But when I was in school," I manage finally to gasp, "All we wanted to do was get away from our parents!" "We never called our parents!" "We despised our parents!" "In fact," I splutter—and this is the showstopper—"we only had one telephone in our whole dorm—in the hallway—for 50 people! If your parents called, you'd yell from your room, Tell them I'm not here!"

After this last outburst, the students too look aghast. Not to mention morally discomfited.
No; these happy, busy, optimistic Stanford undergrads, so beautiful and good in their unisex T-shirts, hoodies, and J.Crew shorts; so smart, scrupulous, forward-looking, well-meaning, well-behaved, and utterly presentable—just the best and the nicest, really—simply cannot imagine the harsh and silent world I'm describing.

•

At the time, I wasn't sure why this conversation left me dumbfounded, but it did. It stayed with me for weeks, and I told numerous pals about it, marveling again at the bizarreness of contemporary undergraduate life. One said she talked to her mother five times a day! In the moment, the exchange had awakened in me a fairly dismal psychological sensation I'd sometimes felt in classes before (one hard to acknowledge, so out of step with official norms does it seem): namely, that teaching makes me feel lonely. Not all the time, but enough to notice. Lecturing before students, I will suddenly feel utterly bereft. A cloud goes over the sun. Though putatively in charge, I'm estranged from my charges—self-conscious, alone, in a tunnel, the object of attention (and somehow responsible for everything taking place) but unable to speak a language anyone understands. I feel sad and oppressed, smothered almost, slightly panicky. It's a sensation one might have in an anxiety dream—the sort in which you feel abandoned and overwhelmed and without something you desperately need. They've gone away and left me in charge of everything. At least in my own head, it's the sensation of orphanhood.

One rallies, of course. Professor Freakout soldiers on and the feeling dissipates. The business of the day returns. But the psychological cloud can remain for a while, like a miasma. By asking my students a lot of intrusive and impertinent questions, I concluded afterward, I'd obviously brought this grisly mood on myself.
Their charming, fresh-faced, matter-of-fact responses—yes, they were just as busy as their Harvard counterparts, but, yes, they also managed to stay in (surprisingly) close touch with parents (i.e., they loved and were loved in return)—had somehow triggered my orphan-reflex. I had only myself to blame. I chastised myself for having temporarily forgotten that students today—not just those at Harvard or Stanford, of course—live in a new, exciting, exacting "24/7" world, one utterly unlike (mentalité-wise) the one I inhabited as an undergraduate. They seem reasonably content with their lot; in fact appear to take the endless "connectivity" for granted—the networking, blogging, Skyping, Facebook posts, Twitter feeds. And why shouldn't they? Have they ever known anything else? None of it made me happy, but neither was I particularly happy with myself.

Now, lest one wonder, I should say upfront I am not an orphan—or at least not in the official sense. At the time of writing, both my parents are still alive—in their mid-80s, but frail, beginning to fail. They don't live together. In fact, despite residing less than a mile apart, they haven't laid eyes on one another for almost 40 years. Not even by accident in the Rite Aid store. Don't ask. They've had five rancorous marriages between them. I haven't seen my father more than 10 or 12 times over the past decade. That my recurrent sense of psychic estrangement—not to say shock at my students' hooked-in, booked-up, seemingly bountiful lives—might be in some way connected with these Jolly Aged P's is a topic that would no doubt require a posse of shrinks to explore thoroughly. But even without reference to private psychodrama, I think I now at least half-grasp the reason why my students' overscheduled lives, so paradoxically conjoined (I felt) with intense bonds with parents, discombobulated me so thoroughly.

Unsurprisingly, orphanhood—that painful thing—has everything to do with the case.
Orphanhood conceived, that is, in the broadest sense: as a metaphor for modern human experience, as symbol for unhappy consciousness, as emblem of that groundwork—that inaugural experience of metaphysical solitude—that Martin Heidegger deemed necessary for the act of philosophizing. Orphanhood conceived, in other words, as a condition for world-making—as both the sorrow and creative quintessence of life.

Now that's a bit of a mouthful, I realize, so let me explain it in simpler terms. If you teach the history of English and American literature (as I've done most of my life), it's safe to say you will end up, among other things, a state-of-the-art Orphan Expert. Not that it's that hard. You don't need to go back very far in literary history, after all, to find a plethora of orphaned or quasi-orphaned protagonists. At the outset of the play bearing his name, Hamlet, poor mite, might best be understood, after all, as a sort of half-orphan—indeed, a half-orphan with an unconscious wish to become a full-service orphan. If not downright matricidal, he seems aggrieved enough by his mother's perceived betrayals to wonder if hastening her demise might not make life at Elsinore Castle rather more enjoyable for everybody concerned.

And what is Milton's Paradise Lost if not one of Western culture's great parables of self-orphaning? Along with the Oresteia and the Oedipus plays, it's a sort of poetical primer on how to forfeit the love and care of one's Creator in a few outrageous, easy-to-follow steps. Satan's not really to blame for the mess: He's just a figment, the kid who sticks chewing gum on the table leg. Adam and Eve know perfectly well what they are doing when they eat the fruit of the Tree of the Knowledge of Good and Evil. They want to eat it.
And when they are seen, misery-ridden, leaving life in the Garden behind ("They, hand in hand, with wand'ring steps and slow,/ Through Eden took their solitary way"), they carry with them all the pathos of suddenly abandoned children. They have no mother, presumably, and their Father is dead to them. Worse yet, they are wise orphans; they recognize their own culpability in their loss. Cosmically amplifying their sorrow is the sickening, banal, no-way-back knowledge that they've brought their banishment on themselves. Daddy took the T-Bird away. But we should never have been driving it in the first place.

Yet for English speakers, it's in classic Anglo-American fiction—in the novel, say, from Daniel Defoe, Aphra Behn, Samuel Richardson, and Henry Fielding to Dickens, Eliot, Twain, James, Woolf, Hemingway, and the rest—that the orphaned, or semi-orphaned, hero or heroine becomes a central, if not inescapable, fixture. Something about the new social and psychic world in which the realistic novel comes into being in the late 17th and early 18th centuries pushes the orphan to the foreground of the mix, makes of him or her a strikingly necessary figure, a kind of exemplary being. (By "orphan" I likewise include those characters—call them "pseudo-orphans"—who believe themselves to be orphans, but over the course of the narrative discover a mother or father or both.) So memorably have these "one of a kind" characters been drawn, we often know them by a single name or nickname: Moll, Tom, Fanny, Becky, Heathcliff, Jane, Pip, Oliver, Ishmael, Huck, Dorothea, Jude, Isabel, Milly, Lily, Lolly, Sula.

Even if you haven't read the books in which these invented beings appear, you've probably heard of them and their stories; may even have a rudimentary sense of what they are like as "people" (self-reliant, footloose, attractive, curious, quick-thinking, lucky, tricky, a mischief-maker, the proverbial black sheep ... and so on).
Alarmingly enough, orphaned protagonists appear regularly in stories written explicitly for children: Witness Little Goody Two-Shoes, Pollyanna, Heidi, Little Orphan Annie, Kim, Mowgli, Bilbo, Frodo, Anne (of Green Gables), Dorothy (she of Toto and Auntie Em), Peter (as in Pan), Harry (as in Potter). And needless to say, these parentless juveniles are usually the heroes or heroines of the books in which they appear. They may be wounded or fey or uncanny (what do we make of the vacant circles that Little Orphan Annie has for eyes?), yet they are also resilient, charismatic, oddly powerful.

•

Thus the first of two big lit-crit hypotheses I'll advance here: More than love, sex, courtship, and marriage; more than inheritance, ambition, rivalry, or disgrace; more than hatred, betrayal, revenge, or death, orphanhood—the absence of the parent, the frightening yet galvanizing solitude of the child—may be the defining fixation of the novel as a genre, what one might call its primordial motive or matrix, the conditioning psychic reality out of which the form itself develops.

Now, even though I've made a talking point of it, what's important here is not merely the frequency with which orphaned heroes and heroines appear in fiction since the 18th century. Yes, from Ian Watt's The Rise of the Novel onward, the phenomenon has inspired some brilliant commentary. In one of the most profound books on fiction ever written, Adultery in the Novel, Tony Tanner associates the orphan trope with the early novel's tendency toward diegetic instability—its ambiguous, unsettled "ongoingness" and resistance to closure:

The novel, in its origin, might almost be said to be a transgressive mode, inasmuch as it seemed to break, or mix, or adulterate the existing genre-expectations of the time. It is not for nothing that many of the protagonists of the early English novels are socially displaced or unplaced figures—orphans, prostitutes, adventurers, etc.
They thus represent or incarnate a potentially disruptive or socially unstabilized energy that may threaten, directly or implicitly, the organization of society, whether by the indeterminacy of their origin, the uncertainty of the direction in which they will focus their unbonded energy, or their attitude toward the ties that hold society together and that they may choose to slight or break.

Like the Prostitute or Adventurer, the Orphan embodies the new genre's own picaresque "outlaw" dynamism.

Precisely because the 18th-century orphan-hero is usually untried, unprotected, disadvantaged (not to mention misinformed or uninformed about his or her parentage), he or she can function as a sort of textual free radical: as plot-catalyst and story-generator—a mixer-upper of things, whose search for a legitimate identity or place in the world of the fiction at once jump-starts the narrative and tends to shunt it away from didacticism and any predictable or programmatic unfolding of events.

A flagrant example of such jump-starting occurs in Defoe's Moll Flanders (1722). Here it is precisely the eponymous heroine's putative orphanhood (she knows only that her mother, whom she presumes to be dead, was a thief and gave birth to her in Newgate Prison) that catalyzes, among other scandals, one of the novel's most titillating (if outlandish) episodes: Moll's shocking marriage-by-mistake to her own brother. (Only well into their marriage, after she and her brother have several children, will Moll realize that her chatty mother-in-law, his mother, is also her mother—long ago transported to America, but still alive and flourishing.) Defoe purports to moralize in Moll Flanders—in his Preface he describes his narrative as free of "Lewd Ideas" and "immodest Turns"—a work "from every part of which something may be learned, and some just and religious inference is drawn."
Yet bizarrely, through some inscrutable narrative magic, the very mystery in which Moll's birth is shrouded triggers one of the novel's most perverse and sensational incidents. What on earth are we meant to "learn" from it? Don't ever get married, in case your spouse is really your long-lost brother or sister?

Yet Moll Flanders also illuminates a perhaps more profound aspect of the orphan narrative: its austere embedding of a certain hard-boiled psychological realism. Even when the hero or heroine recovers a lost parent, that person can shock or mortify. The "orphan mentality" can persist, alas, post-reunion. Thus Moll finds out that, yes, as she's been told, her mother is a raddled old Newgate jailbird, with the livid mark of the branding iron on her hand. Now, for most of us, such a revelation—even barring incestuous ramifications—would be disillusioning, to say the least. Imagine: After years of loneliness, of longing for a tender maternal embrace, you finally, miraculously, locate your birth mother: She turns out to be a convicted felon. A whore. A liar and check-kiter. A crystal-meth addict. No help there; she's way worse off than I am.

•

Freud famously described the "family romance" as the childhood fantasy that one's parents aren't, in actuality, one's real parents—that one was switched in the cradle, left in a basket on the doorstep, found under a cabbage leaf or the like, and that one's real father and mother are persons of great wealth, beauty, and high station, a king and queen, perhaps, who will someday return to reclaim you and love you in the way you deserve. He thought such fantasies especially likely to develop at the birth of a sibling, when anger at the parents—for introducing a presumably odious rival into the family circle—is at a height. Real parents are disparaged; imagined parents idealized. The scenario in Moll Flanders reads like a sendup of the Freudian romance: almost a spoof on it.
It's not simply that the lost-and-found parent turns out to be disappointingly "trashy." She's quite shockingly trashy—sneaky, disingenuous, a terrible old crone with false teeth, sleazier than you even thought possible. But you're stuck with her, it seems, for life, unless you can find a way to write her back out of your story.

If one wanted to be fancy, one might dub this familial antiromance the "emotional drama of the post-Enlightenment child." Moll does not cease to be "orphaned" having rediscovered her mother; on the contrary, she abandons her (and the brother-husband), and resumes her solitary adventuring. And while she will re-encounter the brother later—indeed inherit the Virginia plantation he and the mother have established—Moll never sees her mother again. The maternal reappearance alters little or nothing in the heroine's inner world: Psychologically speaking, Moll is as alone at the end of the fiction as she was when she started. She's what you might call a self-orphaner, an orphan by default. Evasive, secretive, deeply intransigent—one of life's permanent orphans.

In the broad, even existential, sense of the term I deploy here, orphanhood is not necessarily reducible to orphanhood in the literal sense. At least metaphorically, virtually any character in the early realist novel might be said to be an orphan—including, paradoxically, many of those heroes and heroines who have a living parent (or two), or end up getting one, as Moll Flanders does. A feeling of intractable loneliness—of absolute moral or spiritual estrangement from the group—may be all that it takes. You don't need to have been abandoned by a parent in the conventional sense, in other words, to feel psychically bereft.

Indeed, from a certain angle—and thus my second big lit-crit hypothesis—the orphan trope may allegorize a far more disturbing emotional reality in early fiction: a generic insistence on the reactionary (and destructive) nature of parent/child ties.
The more one reads, the more one confronts it: Whatever their status in a narrative (alive, dead, absent, present, lost, found), the parental figures in the early English novel are, in toto, so deeply and overwhelmingly flawed—so cruel, lost, ignorant, greedy, compromised, helpless, selfish, morally absent, or tragically oblivious to their children's needs—one would be better off without them. You might as well be an orphan.

Julia Kristeva remarks somewhere (my wording may not be exact) that "in every bourgeois family group there is one child who has a soul." And thus we meet them, in novel after novel: not only those who go literally motherless and fatherless, but also the children "with souls" who, for precisely that reason, will be persecuted by their foolish parents or parental stand-ins; ostracized, abused, made to submit to some hellish moral and spiritual reaming-out. Ruthlessly, imperviously, the realistic novels of the 18th and 19th centuries compulsively foreground this "orphaning" of the psyche; shape it into parable, and in so doing (I think) dramatize the painful birth of the modern subject—that radically deracinated being, vital yet alone, who goes undefined by kinship, caste, class, or visible membership in a group.

Witness, for example, the predicament of the eponymous heroine at the outset of Samuel Richardson's august and appalling masterwork, Clarissa. (Published in 1748, Clarissa, for those of you who haven't read it, is the greatest novel ever written in any language.) Now although the young and virtuous Clarissa Harlowe has grown up, presumably happily, at Harlowe-Place surrounded by her "friends"—i.e., both of her parents, two siblings, and several uncles—as the novel opens, she's just been "orphaned" in the emotional sense: profoundly, inexplicably, and shatteringly rejected.
(Ironically, the word "friend" in the 18th century can not only mean someone outside the family circle whom one likes or loves, but also a member, simply, of one's immediate family circle.) When Clarissa refuses to marry the man of her father's choice, a rich and grasping Gollum-like creature named Solmes (one always imagines him with webbed feet), her "friends" morph abruptly, and nightmarishly, into domestic dungeon-masters. They revile Clarissa and threaten to disown her; they lock her up in her room for days and refuse to see her or read her letters; they forbid her contact with anyone who might help her; her father curses her. As they prepare to marry her off to Solmes "by force," she seems ever more like one of the victim-children in fairy tales, the designated family sacrifice.

Now Richardson critics over the past few decades have tended to skate past these terrifying opening scenes in order to concentrate on Clarissa's sufferings later at the hands of Lovelace, the charming sociopath and would-be rescuer who seduces her. Yes, Lovelace's depredations later are spectacular and obscene—he kidnaps her, drugs her, rapes her while she is drugged, and ultimately hounds her to death. Yet even before Lovelace enters the novel (or so I have always felt), Richardson has already saturated the novelistic mise-en-scène with an even more unnerving and absolute kind of horror. "Home" is the primordial horror-show in this novel—a place of dehumanization and soul-murder from which the child, to save herself, must somehow escape. Count the Harlowes, likewise, among the ghastliest fictional parents outside Greek tragedy—all the more so because they speak the language of sentimental bourgeois feeling. Even as they subject their daughter to unspeakable torments, they "love" Clarissa, they say; that is why she must be so brutally forced to obey.

Yet one finds these dire mamas and papas everywhere in early fiction—even comic fiction.
They are omnipresent in works by Fielding, Smollett, Burney, Horace Walpole, Mary Shelley, and Ann Radcliffe. Even Jane Austen, arguably, offers an indictment of parents as harsh as that in the Gothic fiction of Shelley or Radcliffe. Witness the foolish, manipulative, greedy, or otherwise profoundly unsatisfactory mothers and fathers in Northanger Abbey, Pride and Prejudice, Mansfield Park, Emma, Persuasion. Austen typically veils the inadequacy, even malice, of her fictional parent-figures by festooning them with comic trappings: We laugh at the absurd Mrs. Bennet, the whinging Mr. Woodhouse, even the monstrous Sir Walter Elliot—the vain, pomaded, rank-obsessed father of Anne Elliot, heroine of Persuasion. (Mothers are often long-dead in Austen, and as in many other works by women from the period, the heroine is obliged to live with a cold, oppressive, or dissociated father.)

In real life, having any of these narcissistic nongrown-ups for a parent would be a nightmare come true. They induce bewilderment and a sense of genetic incommensurability. How can Emma—brilliant, coruscating, kind—be the child of the dull, mewling, psychotically self-centered Mr. Woodhouse? Austen's heroines, in particular, are often especially changeling-like—sleek, witty, perceptive misfits, who appear oddly unintegrated into whatever (usually reduced) version of the family unit the novelist has devised for them.

What to do with the parents who fail us so abysmally? Perhaps the most drastic solution is to imagine a fictional world from which parents have simply been erased—psychically blanked out—absolutely, and long in advance of any narrative unfurling. Charlotte Brontë's books are a terrifying case in point. They project worlds in which estrangement, loss, and silence about the past seem the precondition for narrative itself. Brontë omits the "back story"—or provides only a fatally impoverished one.
Neither of her best-known narrators, Jane Eyre and Lucy Snowe, has a living father or mother: Jane's parents have died of typhus; of Lucy's we know nothing at all. Both heroines seem to emerge out of, and continually slip back into, an amorphous, staggering, irrevocable loneliness. One senses in their aphasia about the past some suppressed horror. Reading Lucy's glassy-eyed narrative, in particular, is like listening to someone who's had a head injury, or suffers from post-traumatic amnesia.

We quickly learn not to expect any answers; some submerged trauma is itself the given, the starting point. Crucial information will never be forthcoming. For these are orphan-tales, drawing us, ineluctably, into a domain of emptiness and pain. Yes, Jane Eyre and Lucy Snowe may know their own names—first and last both. (Many fictional orphans don't.) But, affectively speaking, everything else has gone blank. The system crashed long ago. Not only have they no parent or guardian to point to, they seem to have no idea—emotionally, spiritually—what words like "mother" and "father" might mean.

•

So what—you may be wondering—has all this gloomy business to do with my frantic, ambitious, madly multi-tasking students? With helicopter Moms and Dads? With so-called Velcro parents? The ones who keep messaging 24/7? Surely I don't wish to link all the ultra-depressing things one encounters in literature—O, the horror, the horror, etc.—with the banal, addictive, anodyne back-and-forth of contemporary student life? Hello, you have 193 new messages. Checking for software updates. Your start-up disk is almost full. Hey, it's Mom. I was just wondering if you'd had time yet to. ...

Or do I?

My answer must be both circumspect and speculative. I don't wish, on the one hand, to sound like someone nostalgic for pain—a relic, a loneliness-junkie, a cheerleader for real-world orphanhood, or (when you get right down to it) a proponent of Orestes-style matricide or patricide. (Not usually, anyway.)
On the other hand, I can't help but wonder if we haven't lost the thread when it comes to understanding part of what a "higher education" ideally should entail. Pious college officials yammer on about the need for students to develop something they (the officials) call "critical thinking" and thereby gain intellectual autonomy: a foothold on adulthood. But I'm wondering if it isn't time to reaffirm an idea that "critical thinking" begins at home, or better, with home—which is to say, that each of us at some point needs to think (dispassionately, daringly) about the "homes" from which we emerge and what we really think of them.

Do you owe your parents your obedience? Your deference? Your love? Your phone calls? The questions sound harsh because they are. But our Skype-ridden times may require a certain harshness.

Some of the primal myths of our culture—as the greatest artists and writers have always intuited—seem to authorize violence, real or emotional, between the human generations. Francisco Goya's sublime and horrific masterpiece, "Saturn Devouring His Son" (ca. 1819-23), depicts a shocking event in Greek mythology—the cannibalistic murder by the primeval Titan god Kronos (Saturn, in the Roman version) of one of his children. Having received a prophecy that he will be overthrown by one of his own offspring, Kronos devours each of his five children at birth. His wife Rhea (Ops, in the Roman version) manages to save their sixth child, Zeus, only by hiding him away on Crete and feeding Kronos a stone in swaddling clothes in place of the newborn.
Kronos is fooled, and later this same Zeus, father of the new Olympian gods, overthrows his father, as predicted.

An image to shock and awe, undoubtedly, but also one of the great paintings made in that period we call the Enlightenment: that revolutionary era (say, roughly, 1660-1820) during which—for better or for worse—Western culture began to shake off some of the more baleful and stultifying aspects of the Judeo-Christian past and reimagine itself as "modern."

The central insight of the period? It's so familiar to us, perhaps, that we have lost sight of its momentousness: that individual human beings are endowed with critical faculties and powers of moral discernment, and as a result, have a right, if not the obligation, to challenge oppressive, unjust, and degrading patterns of authority. Over the course of the 18th century and into the 19th, more and more educated men (and a few brave women) felt intellectually empowered enough to criticize previously sacrosanct "received ideas": traditional religious beliefs, established forms of government, accepted modes of social, legal, and economic organization, the conventional dynamics of family life, relations between men and women, adults and children—all those cognitive grids through which we customarily make sense of the world.

At its most potent, the critique was severe—world-changing. A host of Enlightenment freethinkers—Voltaire, Diderot, Rousseau, Hume, Mary Wollstonecraft, Adam Smith—articulated it in passionate and various ways: that the venerable cognitive models human beings had mobilized over the centuries to explain "the nature of things" were often nothing more than self-reinforcing and barbaric "superstition."
Taken for dogma, these man-made belief systems had produced a host of ills: savage religious and political strife, the commercial exploitation of the many by the few, the enslavement and genocidal killing of masses of people, the degradation of women, children, animals, and the natural world—century upon century, in fact, of unfathomable global suffering.

In his iconic essay of 1784, "What is Enlightenment?" Immanuel Kant put it thus:

Enlightenment is man's emergence from his self-incurred immaturity. Immaturity is the inability to use one's own understanding without the guidance of another. This immaturity is self-incurred if its cause is not lack of understanding, but lack of resolution and courage to use it without the guidance of another. The motto of enlightenment is therefore: Sapere aude! Have courage to use your own understanding!

Not that Kant imagined any cultural enlightenment to be easy or bloodless—especially given the seemingly intractable human proclivity for business as usual:

Laziness and cowardice are the reasons why such a large proportion of men, even when nature has long emancipated them from alien guidance, nevertheless gladly remain immature for life. For the same reasons, it is all too easy for others to set themselves up as their guardians. It is so convenient to be immature! If I have a book to have understanding in place of me, a spiritual adviser to have a conscience for me, a doctor to judge my diet for me, and so on, I need not make any efforts at all.
I need not think, so long as I can pay; others will soon enough take the tiresome job over for me.

I confess: I first read those words over 25 years ago, and they have never ceased to thrill me.

I understand the orphan-narratives of literature the same way I do Goya's painting and Kant's exhortation: as imaginative vehicles designed to shock us into "critical thinking" about those Titan figures we call our parents, and the larger psychosocial forces they so often (wittingly or unwittingly) represent. The intimate authority of parents is, after all, the first kind of authority most of us experience; the parental command the first utterance we recognize as that which must be obeyed. Pain and suffering, we soon learn, will result from our disobedience.

And soon enough, most of us become adept at shaping our wishes according to a system of superimposed demands. We learn as young children to control the way we eat, drink, and eliminate waste; we learn to clean our own bodies; we learn under what circumstances it is appropriate to yell or scream or cry, and when we must be silent. Later on, "adult" society will impose further, ever more complex demands. Thus we internalize all those second-order codes of behavior associated with the educational, political, religious, and economic domains within which we all attempt to function, with lesser or greater success.

Yet might it not be the case that true advances in human culture—the real leaps in collective understanding—typically result from some maverick individual action—some fundamental disobedience on the part of the individual subject? Such maverick actions often disturb—precisely because they need to get our attention. We have to be jolted out of complacency. The greatest artists invariably disrupt and disturb in this way.
Like many of the novelists I've been describing, Goya gives us a shocking scene of intergenerational violence—but he does so, precisely, I wager, to force us to confront some of the deepest and hardest feelings we have—about parental authority and its rightful scope, about family violence, about the power of the old over the young, about the role of paternalism in society and government, about whether or not, indeed, those people we designate as "fathers" (priests, doctors, political leaders, scientists) or "mothers" (nurturers, apple-pie makers, self-sacrificing soccer moms, iPhone FaceTime partners, Mama Grizzlies, Tiger Mothers) really Know Best, about whether it is incumbent upon us to exert ourselves against them.

You don't have to be a professor, I think, to see Goya as a radical naysayer—a human being horrified by a certain bestial and soul-destroying kind of parental authority. The focus in the "Saturn" painting is on paternal despotism; but elsewhere in Goya's oeuvre we find, too, a frightful bevy of murderous mothers—notably in Los Caprichos (1799), a suite of fantasy-engravings depicting monstrous witches, crones, goats, and owls engaged in child-torture of different sorts. The questions Goya raises remain awful and unremitting, more than 200 years later. Is the rule of life eat or be eaten, even if what you consume is your own child? (One of the most terrible things about "Saturn Devouring His Son" is surely the fact that the headless, half-eaten "child" has the proportions not of a newborn infant, but of an adult human being.) Should we resist our creator's authority? When and how and why? Or should we let ourselves be murdered in his name? When and how and why?

Such questions lie at the heart of great literature too. What the early novel dramatizes, it seems to me, is nothing less than a radical transformation in human consciousness—the formation of a new idea.
For better or worse, the ferocious, liberating notion embedded in the early novel is that parents are there to be fooled and defied (especially in matters of love, sex, and erotic fulfillment); that even the most venerated traditions exist to be broken with; that creative power is rightly vested in the individual rather than groups, in the young rather than the old; that thought is free. The assertion of individual rights ineluctably begins, symbolically and every other way, with the primal rebellion of the child against parent.

So where are we today? Are we in the midst of some countertransformation? A rolling back of the Enlightenment parent-child story? Are we returning to an older model of belief—to a more authoritarian and "elder centric" world? The deferential-child model has dominated most of human history, after all. Maybe the extraordinary Enlightenment break with the age-old commandment—honor thy father and thy mother—was temporary, an aberration, a blip on the screen.

My own view remains predictably twisty, fraught, and disloyal. Parents, in my opinion, have to be finessed, thought around, even as we love them: They are so colossally wrong about so many important things. And even when they are not, paradoxically, even when they are 100 percent right, the imperative remains the same: To live an "adult" life, a meaningful life, it is necessary, I would argue, to engage in a kind of symbolic self-orphaning. The process will be different for every person.
I have my own inspirational cast of characters in this regard, a set of willful, heroic self-orphaners, past and present, whom I continue to revere: Mozart, the musical child prodigy who successfully rebelled against his insanely grasping and narcissistic father (Leopold Mozart), who for years shopped him around the courts of Europe as a sort of family cash cow; Sigmund Freud, who, by way of unflinching self-analysis, discovered that it was possible to love and hate something or someone at one and the same time (mothers and fathers included) and that such painfully "mixed emotion" was also inescapably human; Virginia Woolf, who in spite of childhood loss, mental illness, and an acute sense of the sex-prejudice she saw everywhere around her, not only forged a life as a great modernist writer, but made her life an incorrigibly honest and vulnerable one.

In a journal entry from 1928 collected in A Writer's Diary, Woolf wrote the following (long after his death) about her brilliant, troubled, well-meaning, tyrannical, depressive, enormously distinguished father—Sir Leslie Stephen, model for Mr. Ramsay in To the Lighthouse and one of the great English "men of letters" of the 19th century:

Father's birthday. He would have been 96, 96, yes, today; and could have been 96, like other people one had known: but mercifully was not. His life would have entirely ended mine. What would have happened? No writing, no books—inconceivable. ...

The sentimental pathology of the American middle-class family—not to mention the mind-warping digitalization of everyday life—usually militates against such ruthless candor. But what the Life of the Orphan teaches—has taught me at least—is that it is indeed the self-conscious abrogation of one's inheritance, the "making strange" of received ideas, the cultivation of a willingness to defy, debunk, or just plain old disappoint one's parents, that is the absolute precondition, now more than ever, for intellectual and emotional freedom.

In your view, which great television shows were eventually made into terrible movies?

Dark Shadows (1966–1971) and the 2012 feature film.

My name is Victoria Winters. My journey is beginning. A journey that I hope will open the doors of life to me and link my past with my future. A journey that will bring me to a strange and dark place, to the edge of the sea, high atop Widows' Hill. A house called Collinwood, a world I've never known, with people I've never met. People who tonight are still only shadows in my mind, but who will soon fill the days and nights of my tomorrows.

And so began a Gothic soap opera that made quite a splash in contemporary pop culture.

The show began as the story of Victoria Winters, a young woman raised in a New York City orphanage. She had it better than the other orphans thanks to anonymous money orders sent for her care. The money orders came from Collinsport, Maine. Later Victoria, or Vicki, had a chance to go to college thanks to a scholarship managed by a law firm from Collinsport. Upon graduation Vicki was offered a job as teacher and governess to a little boy, David, living in Collinsport.

Vicki found that the boy was being raised by his older aunt Elizabeth (1930s film star Joan Bennett). Elizabeth is a widow who, for reasons she won’t go into, has not left her estate in 19 years. She runs the family’s considerable holdings from her elegant drawing room. Elizabeth herself has a daughter, Carolyn, a gorgeous blonde tramp who dates small-time hoods and men who want to hurt her family. David is deeply disturbed and particularly hates his father; his mother is not in the picture. His father, Roger, was a gambler and playboy who seldom was around unless he needed money.

Vicki dealt with this strange assortment and found a boyfriend, the mysterious Burke Devlin, whom Roger both hates and fears. Burke hates the entire family. Vicki gets support from her best friend Maggie Evans. Maggie was played by Kathryn Leigh Scott, one of the first Playboy Bunnies, also known for a memorable Star Trek role.
She is now a writer who has published dozens of books and articles on Dark Shadows and Playboy. Maggie is a waitress, and her father, Sam (stage legend David Ford), is an artist who has a strange friendship with Roger and has more money than a not-so-famous artist should have. He flips when Maggie double-dates with Vicki and Burke. Sam seems to fear Burke too.

The lovely and innocent Vicki was played by actress Alexandra Moltke. Moltke, the daughter of Danish nobility, later quit acting to be a wife and mother. She married Philip Isles, an attorney and member of the wealthy Lehman banking family. She later became the mistress of Claus von Bülow and testified in the sensational trial when Claus was charged with the attempted murder of his wife, Sunny, who was left in a vegetative state. This media circus was made into an Academy Award-winning film, Reversal of Fortune, in which Moltke-Isles was played by Julie Hagerty.

Borrowing occasionally from Wuthering Heights, Dark Shadows was the baby of producer Dan Curtis, who later became a TV mogul and film director. He loved the show and was crestfallen at the low ratings and impending cancellation.

With nothing to lose, Curtis and the writers decided to “have a little fun.” Vicki had been kidnapped by the estate’s deranged caretaker, Matt (Thayer David, a prolific film actor). The storyline was long and not popular. Matt held Vicki in an old abandoned mansion on the property. Deciding to kill her, Matt is menaced by a ghost emerging from an old portrait. He dies.

The ratings spiked. So Curtis, who really just wanted to focus on his “normal storyline,” approved more “fun.” The ghost was given a name, Josette, a distant Collins ancestor by marriage. Troubled David developed an affinity with her. Later David’s mother turns up wanting custody. It appears to be a standard soap opera custody case until we learn that his mother, Laura, is a phoenix determined to take David with her into the flames.
He is saved in the nick of time by Josette. The ratings were now jumping. Curtis decided to go to the next level and introduce a vampire.

Elizabeth was at this point dealing with a blackmailer, Irish gangster Jason Maguire (perennial TV actor Dennis Patrick). Jason was aided by a petty hood and bully, Willie (John Karlen).

Willie happens to hear about Barnabas Collins, an ancestor of the family who was a Revolutionary War hero and a friend of Thomas Jefferson. Barnabas had been scolded by his puritanical parents for squandering money on fancy men’s jewelry: rings, stickpins, walking sticks, and so on. Barnabas had been engaged to Josette but left for England, never to return, when she died. Willie investigates and realizes that the family history is incorrect: Barnabas actually died at home and was buried in a hidden mausoleum. Believing that Barnabas might have been buried with his jewels, Willie opens the casket, and a hand shoots out!

Later Elizabeth and the family are visited by a cousin from England they did not know existed. The elegant gentleman explains that his name is Barnabas and that he is a descendant of the Barnabas who left for England. He says that he is a student of history and asks if he might live in the old abandoned mansion, which, it happens, was where the original Barnabas grew up. He promises to restore it himself. Taken with the elegant gentleman, Elizabeth agrees, and everyone notes how much their new friend resembles the portrait of the original Barnabas. In true television fashion, the family miss the obvious and never figure it out.

Producer Dan Curtis thought that the character would appear for a few weeks, get staked, and they’d move on to something else. Boy, was he wrong! Barnabas became very popular, and the network exercised its option and hired the actor as a regular cast member.

Barnabas was played by Jonathan Frid, an actor who never intended to be a soap star.
Tired after years of hustling for poorly paying stage work in New York and London, the graduate of the Royal Academy of Dramatic Art and holder of a master’s from the Yale School of Drama had intended to quit acting and teach. He often recalled that he almost didn’t answer the phone when his agent called about Dark Shadows and only agreed to appear because he was told that the character would be killed off in a few weeks. Frid saw the role as a quick buck.

Barnabas became huge, though. The erudite, greying, middle-aged, and gay Frid was stunned when he became a sex symbol. In 1969 he received more fan letters than the Beatles. His likeness adorned toys, games, and magazine covers, and Frid was sent all over the country opening malls and appearing in parades. He was even asked to attend a party hosted by First Daughter Tricia Nixon.

Dark Shadows began to focus solely on the supernatural, and the audience follows Barnabas through attempts to cure his vampirism. We learn his origins in an extended storyline in which Vicki travels through time to the 1790s and meets Barnabas before he was a vampire.

In 1795 we meet Angelique, the evil witch who turned Barnabas into a vampire. Thanks to Vicki, Angelique learns Barnabas lives in the present day and follows Vicki when she comes home. Angelique was driven to madness after Barnabas dumped her for Josette in the 1790s.

Barnabas himself dealt with a Frankenstein’s-monster type of character, and with Angelique’s warlock master, whom she plots against as much as she does Barnabas, and he meets some interesting characters, including his first real friend, Dr.
Julia Hoffman (Academy Award-nominated actress Grayson Hall).

Angelique, played by Lara Parker, became a sex symbol. Later she became a vampire herself. Angelique and Barnabas were sexy and popular vampires years before Twilight or True Blood.

The time-travel sequence was so popular that the show also went to the 1890s and the 1840s, as well as crossing dimensions.

Of course there were goofs and plot holes. Props broke, walls sometimes tipped, actors flubbed their lines. Actor Robert Rodan, who later became a realtor working with a young Donald Trump, once said, “If the stagehand falling into the set kept his pants on, we’d keep rolling.”

It’s understandable: soaps were taped at a time when videotape was costly to edit, and it was standard practice to ignore bloopers. Of course, most soap operas didn’t have the complicated dialog, elaborate costumes, specialized props, or special effects. So fans then and now love to find the flubs.

Poor Frid, who was more comfortable with Shakespeare than with dialog that had to be learned quickly, often flubbed, although he did well with monologs, where you can see his eyes looking into the distance. Grayson Hall once commented that she believed half the reason Barnabas was a sex symbol was his faraway, haunted looks, and she laughed that the audience didn’t know that the nearsighted Frid was desperately looking for the teleprompter!

Frid himself once joked to his friends that even as a gay man he thought it was ridiculous that a man would travel through time and other dimensions to avoid someone who looked like Angelique!

After a few years, however, the stress of the show took its toll on Frid, who asked for “another monster to take the pressure off.” The producers found the tall, handsome young actor David Selby (who later starred in Falcon Crest). Selby played the murderous ghost Quentin for several months. He had no dialog, but a haunting instrumental preceded his appearance.
Called “Quentin’s Theme,” the tune hit the Billboard pop charts. Later Quentin’s origins are revealed when Barnabas travels to the 1890s and meets the living Quentin, a curse victim and werewolf!

Quentin’s popularity faded, and Frid once again became the main star, although the whole cast was very popular. The actors saw it more as a theatrical repertory company than as a soap opera; they often played multiple characters and loved the challenge, if not the skimpy paychecks. John Karlen, who played five different characters on Dark Shadows and went on to win an Emmy for Cagney and Lacey, often called the show better training for actors than any college or acting school.

Fondly remembered, the show started the careers of many actors: Kate Jackson, Abe Vigoda, Denise Nickerson, Marsha Mason, Conrad Bain, and Dana Elcar, among others.

Of course, all good things have to come to an end. Dark Shadows began to lose its novelty. Cashing in on its popularity, two theatrical films starring the cast from the show were released. The movies were quite graphic and violent, and parents who had never seen the show began to worry and forbade their kids to watch. The writers also began to run out of ideas, and toward the end of the show there were some weak storylines. After an incredible run, the show was cancelled.

But it didn’t die! The cast reunited for talk shows and Dark Shadows conventions for years. In the 1980s Dark Shadows conventions were bigger than Star Trek conventions.

The popularity was noticed, and the show was rebooted in 1991 with an all-star cast that included Hollywood legend Jean Simmons and Chariots of Fire star Ben Cross. Also cast were sci-fi icon Roy Thinnes and a young Joseph Gordon-Levitt. A notorious flop, the show was quickly cancelled. But in 2004 a pilot for a new series was made. It did not become a series, but it was popular online.
This spurred Tim Burton and Johnny Depp to make a theatrical version of the show in 2012. The film was well promoted and featured an all-star cast. It did reasonably well, but it left most fans of the original series cold.

The original series was unintentionally campy; the movie not only tried for camp but overdid it. The jokes were flat, and frankly, Depp and Burton’s magic was fading.

If the internet is to be believed, a large part of the audience watched the movie because of the cameos by the original cast. Frid, who had declined cameos in other revivals, appeared briefly in this film along with Scott, Parker, and Selby. Even Depp fans felt that the 88-year-old Frid, who appeared only briefly, had more presence and power than the young movie star.

As of this writing a new series is in development, but it’s safe to say that the original will never be topped.
