Book 2 2 Page 1 August 5 2014: Fill & Download for Free

GET FORM

Download the form

A Step-by-Step Guide to Editing The Book 2 2 Page 1 August 5 2014

Below is a detailed guide to editing and completing a Book 2 2 Page 1 August 5 2014. Get started now.

  • Push the “Get Form” button below. You will be taken to a dashboard where you can make edits to the document.
  • Select a tool you want from the toolbar that shows up in the dashboard.
  • After editing, double-check your work and press the Download button.
  • Don't hesitate to contact us via [email protected] if you need help.

The Most Powerful Tool to Edit and Complete The Book 2 2 Page 1 August 5 2014

Modify Your Book 2 2 Page 1 August 5 2014 Right Away


A Simple Manual to Edit Book 2 2 Page 1 August 5 2014 Online

Are you seeking to edit forms online? CocoDoc can help you with its comprehensive PDF toolset. You can make full use of it simply by opening any web browser. The whole process is quick and easy. Check the steps below to find out how.

  • Go to the PDF Editor page.
  • Import a document you want to edit by clicking Choose File or simply dragging and dropping.
  • Make the desired edits to your document with the toolbar at the top of the dashboard.
  • Download the file once it is finalized.

Steps in Editing Book 2 2 Page 1 August 5 2014 on Windows

It's hard to find a default application that can help make edits to a PDF document. Yet CocoDoc has come to your rescue. Follow the manual below to learn possible approaches to editing a PDF on your Windows system.

  • Begin by downloading the CocoDoc application to your PC.
  • Import your PDF into the dashboard and make modifications to it with the toolbar at the top.
  • After double checking, download or save the document.
  • There are also many other methods to edit PDF text; you can find them here.

A Step-by-Step Manual in Editing a Book 2 2 Page 1 August 5 2014 on Mac

Thinking about how to edit PDF documents with your Mac? CocoDoc has the perfect solution for you. It makes it possible for you to edit documents in multiple ways. Get started now.

  • Install CocoDoc onto your Mac device or go to the CocoDoc website with a Mac browser.
  • Select a PDF document from your Mac device by clicking the Choose File tab, or by dragging and dropping.
  • Edit the PDF document in the new dashboard, which encompasses a full set of PDF tools.
  • Save the content by downloading it.

A Complete Manual in Editing Book 2 2 Page 1 August 5 2014 on G Suite

Integrating G Suite with PDF services is a marvellous advance in technology, with the power to simplify your PDF editing process, making it faster and more cost-effective. Make use of CocoDoc's G Suite integration now.

Editing a PDF on G Suite is as easy as it can be:

  • Visit the Google Workspace Marketplace and find CocoDoc.
  • Install the CocoDoc add-on to your Google account. Now you are ready to edit documents.
  • Select the desired file by clicking the Choose File tab and start editing.
  • After making all necessary edits, download the file to your device.

PDF Editor FAQ

Who is the biggest chowkidar of India?

Undoubtedly it has to be Dr. Subramanian Swamy. Here's why!

How it all started: Pro-Nuclear Crusader[1]

With a PhD in Economics from Harvard at the young age of 24, Dr Swamy was already on his way to becoming a "boring" professor for the rest of his life, but a simple challenge from his friends literally changed his life. The challenge was fairly innocent and trivial: Swamy was challenged to learn any foreign language in one year. The 24-year-old Swamy, hungry to take on challenges, took it seriously and set a very high bar for himself, resolving to learn the toughest language in the shortest possible time. Upon a quick survey, he found that Mandarin Chinese was perceived by the majority of linguists as the toughest language to learn, one that would normally require 2–3 years to gain confidence in. So Dr Swamy resolved to learn Chinese, and in the end he learnt it in just three months!

Learning Chinese changed Dr Swamy's life, because he then started paying more attention to political developments in China. As a result, Harvard began assigning him more challenging projects related to the Chinese economy, which nobody else at Harvard could take up owing to their lack of Chinese, a deficiency Dr Swamy had overcome by learning the language in three months.

While closely tracking developments in China during the 1960s, Dr Swamy noticed that the Chinese were very serious about regularly upgrading their military systems. In 1964, when China successfully tested its first nuclear weapon, Dr Swamy was one of the first visionaries in India to realize that India too must have its own nuclear capability in order to defend itself from any future threats.

That is when Dr Swamy took up the challenge of carrying out in-depth research on nuclear strategy, putting extensive effort into it over the next five years.
By 1969, he had come up with a comprehensive analysis of Indian nuclear strategy, with a detailed roadmap of why India needed nuclear weapons and how it could achieve them in the next few years. His research paper "Systems Analysis of Strategic Defence Needs", part of that detailed 1969 analysis, is still used today as a reference for developing India-centric nuclear strategies.

He did not stop at researching and publishing such studies, but took it up to the level of activism, working to convince Congress politicians, who leaned towards Gandhian non-violence principles and hence felt India should not invest in nuclear weapons. While he was a persuader within India, to the rest of the world he had taken up the role of a mediating diplomat, explaining to world leaders that India was going nuclear only to protect itself, never to harm anybody else.

Transformation: Anti-Corruption Activist[2]

During the 1970s, when the nation was plagued with corruption and the youth of the nation erupted against the Congress Party (under Indira Gandhi) in what became the 1974 Bihar movement (Total Revolution), Dr Swamy decided to take up the anti-corruption cause, and there has been no turning back ever since.

While JP and his team led the Total Revolution through protests on the streets, Dr Swamy took it up in intellectual form. After the imposition of the Emergency by Indira Gandhi in 1975, Dr Swamy went underground to carry out his anti-corruption crusade against the Congress party, and Indira Gandhi in particular.

Throughout the Emergency, Dr Swamy launched a series of scathing attacks and criticisms against Indira Gandhi, spreading awareness among the masses about the rampant corruption of Congress. Such was the intensity of his attacks that Indira Gandhi literally had nightmares and spent several sleepless nights because of him.
She even went on to call for investigations against him and alleged he was a CIA agent, because she could not defend herself against his allegations and evidence. When Indira had had enough of Dr Swamy, she tried to trap him by first issuing an arrest warrant against him, and then calling for the attendance of all members of Parliament, failing which their seats would be revoked. With this strategy, Indira knew that Swamy would definitely attend if he wanted to save his seat, and she planned to arrest him when he appeared in Parliament. Thanks to Swamy's dramatic escape plan, her strategy backfired: Dr Swamy managed to register his attendance and then escape from India on a pre-planned flight, in what looked like a scene straight from a suspense thriller.

Next Step: Hindutva Crusader[3]

Under Janata Party rule, as most of the right-wing and Hindutva parties (like the Jana Sangh) came together under one umbrella, Dr Swamy gradually started taking up Hindutva causes. Unlike other leaders who merely used rhetoric to exploit people's religious sentiments for votes, Dr Swamy actually used legal (and sometimes diplomatic) routes to accomplish Hindutva causes.

His most popular achievement during the Janata rule was the opening of the sacred Hindu mountain Kailash to Indian pilgrims. This was possible mainly because Dr Swamy, thanks to his friend's challenge, had gained mastery of Chinese, and hence was able to visit China as a diplomat and win hearts by speaking to his hosts in Chinese.
After winning their confidence, he was easily able to persuade them to open the sacred mountain, and thus began the regular annual pilgrimage called the "Kailash Mansarovar Yatra", of which Dr Swamy himself was the very first pilgrim, leading the first batch.

Eventually, Dr Swamy went on to take up several serious Hindutva causes, including the most popular case of Rama Sethu, in which he single-handedly fought an arduous legal battle to save the Rama Sethu from being destroyed by the Government of India. Thanks to him, that ancient national monument and heritage of the Ramayana is still intact today.

Challenge of Economic Reforms: Economic Crusader[4]

Way back in the late 1960s, Dr Swamy had done in-depth research on the Indian economy and presented his analysis in a comprehensive book titled "Indian Economic Planning: An Alternative Approach", in which he gave a detailed blueprint and proposed dismantling the "licence raj" to revive entrepreneurship and boost the Indian economy. The then PM Indira Gandhi dismissed it and labelled Dr Swamy "a Santa Claus with unrealistic ideas", while the parliamentarians cheered Indira Gandhi and burst into hysterical laughter, mocking Dr Swamy.

However, over the next few decades, as India slipped into economic downturn, and after the fall of the Rajiv Gandhi government and the subsequent dissolution of the V. P. Singh government, it was finally PM Chandra Shekhar who realized the need for economic reforms.
As destiny would have it, when India desperately needed a strong blueprint for economic reform, Dr Subramanian Swamy was at the helm as Commerce Minister under PM Chandra Shekhar, and was given full authority to draft the reforms. Dr Swamy eagerly took up the challenge, and after months of effort he was ready with a final draft of the economic reforms with a detailed blueprint, which was unfortunately put on hold when the Chandra Shekhar government fell to coalition politics.

After fresh elections a few months later, when P. V. Narasimha Rao (PVN) became PM, the first thing he did was approach Dr Subramanian Swamy and request him to be part of the economic planning team that was to navigate India through the economic disaster it was about to face. In the interest of the nation, Dr Swamy happily handed his detailed blueprint to PVN, who passed it to Finance Minister Dr Manmohan Singh. Singh implemented Dr Swamy's blueprint, and thus India escaped a major economic crisis and achieved liberalization.

Current Role: Anti-Corruption Crusader[5]

Petition to strike down anti-defamation laws: In October 2014, Swamy filed a petition in the Supreme Court praying for Sections 499 and 500 of the Indian Penal Code, which deal with criminal defamation, to be declared unconstitutional.[27]

Complaint against Jayalalithaa: In 1996, Swamy filed a criminal complaint against Jayalalithaa which led to her prosecution, conviction and sentencing to four years' imprisonment by the trial court in 2014.[28]
Later, on May 11, 2015, a special bench of the Karnataka High Court set aside the trial court order convicting former Tamil Nadu Chief Minister Jayalalithaa, who was acquitted of all charges in the disproportionate assets case.[29][30] An appeal against the High Court verdict was filed in the Supreme Court.[31] The final verdict of the Supreme Court came in February 2017; it indicted Jayalalithaa posthumously and upheld the trial court judgement in toto.[33]

Phone tapping allegation: Swamy released a letter alleging that a former intelligence chief had asked the DoT to tap the phones of many politicians and businessmen in Karnataka,[34] after which Ramakrishna Hegde, the then Chief Minister, resigned in 1988.[35] Hegde then filed cases against him in 1989 and 1990.[36][37][38]

Hashimpura massacre: In 1987, when Muslim youths were killed in police custody, Swamy spoke out against it and sat on a fast for more than a week at Jantar Mantar, demanding the institution of an inquiry.[39] After 25 years he started pursuing the case once again in court.[40] Rebecca John, a counsel for the Hashimpura complainants, told Additional Sessions Judge Rakesh Siddhartha, who was conducting the trial in the case, that "there is no other motive than politics behind Swamy's plea for further investigation and it would only further delay the trial".[41]

Role in exposing the 2G spectrum case: In November 2008, Swamy, amongst others, wrote the first of five letters to Prime Minister Manmohan Singh seeking permission to prosecute A. Raja in regard to the 2G spectrum case.[42] After not receiving any response,[43] Swamy decided to file a case on his own in the Supreme Court of India, which then asked the Central Bureau of Investigation to produce a detailed report on the matter.[44] He further called on the Indian government to re-auction the 2G spectrum without the involvement of Communications Minister Kapil Sibal.[45] On 15 April 2011, he filed a 206-page petition with PM Singh seeking permission to prosecute Sonia Gandhi on charges of corruption, and also raised doubts regarding her acquisition of Indian citizenship.[46] Swamy filed documents in the court to prosecute Minister of Home Affairs P. Chidambaram, including a 15 January 2008 letter written by Chidambaram to Prime Minister Manmohan Singh. Swamy also placed on record a certified copy of the minutes of a meeting between Chidambaram, Raja and the prime minister during Raja's tenure as the MOC&IT.[47] Since criminal charges were filed against the accused but no evidence was given by Swamy or the CBI, all the respondents had obtained bail as of July 2012.

Sanction to prosecute telecom minister A. Raja: On 31 January 2012, the Supreme Court of India accepted Swamy's petition against the Prime Minister's Office in the 2G case, saying that all public authorities should grant sanction within three months when a request is made for the prosecution of a public official. The Supreme Court said that Swamy had the locus standi to seek sanction from the Prime Minister for the prosecution of A. Raja in the 2G case. Sanction by a competent authority for the prosecution of a public servant has to be granted within a time frame, the apex court said. Justice A. K. Ganguly said that sanction would be deemed granted if the competent authority failed to take a decision within four months.
Swamy's arguments were that he wrote to the PMO on 29 November 2008, but it was only on 19 March 2010 that the PMO replied that his plea was "premature" as an investigation was being carried out by the Central Bureau of Investigation (CBI). Raja was arrested by the CBI in the case and got bail on 15 May 2012 after spending nearly 15 months in Tihar Central Jail.[48] On December 21, 2017, the special CBI court judge acquitted the accused, including A. Raja.[49]

Petition to strike down the "single directive provision": In 1997, Swamy filed a petition in the Supreme Court of India, "Dr. Subramanian Swamy Versus Director, Central Bureau of Investigation & Anr.", to strike down a provision which barred the CBI from investigating corruption charges against officers of the rank of joint secretary and above without prior permission of the Government of India.[50] On 6 May 2014, a five-judge constitution bench held the single directive provision invalid and unconstitutional. The court said that "Protection of prior approval for probing graft charges against officers at level of joint secretary and above has propensity of shielding corruption".[51][52] Experts such as former CBI Director Joginder Singh praised the judgement as "superb".[53] Incumbent CBI Director Ranjit Sinha welcomed the judgement and said, "now a very heavy responsibility has been cast upon us to ensure that no innocent civil-servant is harassed."[54]

Investigation of EVMs: Swamy demanded that an independent committee be formed to check the security and safety of Electronic Voting Machines (EVMs) to avoid any rigging or tampering. He argued that countries like the US, Japan, the UK, Germany and the Netherlands have abandoned EVMs and are using the paper-ballot system, and demanded that a printed receipt be given to every voter after casting a vote.[55][56] His PIL to investigate the working of EVMs was dismissed by the Delhi High Court on 17 January 2012.
The court refused to direct the Election Commission to bring back the paper-ballot system or to use printed receipts. The Commission argued that the use of paper is not feasible given the huge size of the Indian electorate. The court further asked the Election Commission to "immediately begin a process of wider consultations" and Parliament "to go into this question in depth and decide".[57][58] On 22 January 2013, after the court agreed with some of the points raised by Swamy, the Election Commission informed the Supreme Court that it would include the Voter Verifiable Paper Audit Trail (VVPAT) system, then in its testing phase, in the machines, so that every voter would know whom he or she had voted for from a printed slip produced after pressing the EVM button.[59][60][61] The voter paper audit trail has been in use since 4 September 2013.[62][63] On 8 October 2013, the Supreme Court directed the Election Commission to implement the audit trail system in phases in the 2014 general election.[64]

National Herald case: On 1 November 2012, Swamy alleged that both Sonia and Rahul Gandhi had committed fraud and land-grabbing to the tune of ₹20 billion (US$290 million) by acquiring a public limited company called Associated Journals Private Ltd (AJPL) through Young Indian,[65] a private company they owned, formed on 23 November 2010.[66] Through this they obtained the publication rights of the National Herald and Qaumi Awaz newspapers, along with real estate properties in Delhi and Uttar Pradesh.[67] The acquired premises were intended only for newspaper purposes but were used for running a passport office, amounting to lakhs of rupees, the complaint alleges. Swamy further added that Rahul Gandhi hid these facts in his affidavit while filing his nomination for the 2009 Lok Sabha elections.[68][69] The complaint further alleges that on 26 February 2011 AJPL approved the transfer of an unsecured loan of ₹900 million (US$13 million) from the All India Congress Committee at zero interest.[70][71] Swamy argued that it is illegal for any political party to lend such a loan, in violation of Section 269T of the Income Tax Act 1961.[72] On 2 November, the party responded that the loan was given only to revive the National Herald newspaper, with no commercial interest.[73] Swamy decided to approach the Supreme Court to de-recognise the Congress party, while the Election Commission ordered a probe on 17 November 2012.[74][75] The hearing of the case was taken up thereafter on different occasions,[76][77][78][79][80] with the court observing prima facie evidence against all the accused.[78][81][82] On 1 August 2014, the Enforcement Directorate initiated a probe to find any money laundering in the case,[83] while on the same day Swamy was served notice by the High Court.[84] On 28 August, the metropolitan court fixed 9 December for the next hearing of the case,[85][86] while on 12 January 2015 the judge of the Delhi High Court recused himself from hearing the case, stating that the schedule of cases had been changed, and directed that the petitions be placed before an appropriate bench.[87] On 27 January 2015, the Supreme Court asked Swamy to make out a case for a speedy trial in the Delhi High Court, since the petition could not be heard directly.[88] On 18 September 2015, it was reported that the Enforcement Directorate had reopened the investigation.[89] Following this, on 19 December 2015, the Patiala House Court granted unconditional bail immediately on hearing to all five accused but one.[90][91][92] On 12 July 2016, the Delhi High Court set aside the trial court orders of 11 January[93] and 11 March,[94] based on a plea by Swamy to examine the balance sheets of the Congress party, AJL and Young Indian from 2010 to 2013,[95][96][97] and fixed the date of the next hearing as 20 August.[98]

Swamy's long legal battle with PC and his corrupt family is reaching its logical conclusion.
Chidambaram was the architect and author of all the mega scams in India, namely 2G, Coal, the Non-Performing Assets (NPA) scam, Aircel-Maxis, NDTV, Vasan Eye Care, the Saradha scam, the forex derivatives scam[1] and the Airbus scandal, to name a few that rocked the nation.

It was Subramanian Swamy who, in August 2018, unlocked the clandestine relationship of PC and Ahmed Patel with the Congress party's Karnataka money bag D. K. Shivakumar and handed over the evidence to investigative agencies, leading the Enforcement Directorate (ED) to register a case against DK recently.[2]

Swamy has vowed to track down all the white-collar criminals to cleanse politics. After hunting down Sonia Gandhi and Rahul Gandhi in the National Herald case, P. Chidambaram and family in multiple corruption cases, B. S. Hooda in the HUDA case, and D. K. Shivakumar in the hawala case linked to AP and PC, Swamy is likely to push for the prosecution of the culprits in the courts of law.[6]

Last but not least is Rahul Gandhi's citizenship row.[7] The home ministry has issued a notice to Congress chief Rahul Gandhi, asking him to explain his "factual position" on a complaint filed by BJP parliamentarian Subramanian Swamy that he holds British citizenship. The Congress chief has been given a fortnight to respond to the notice. Reacting, Subramanian Swamy, who has in the past raised questions about his citizenship and qualifications, wrote on Twitter: "Buddhu has also filed income tax returns in UK during 2004-2006 as a British Citizen as Rahul Gandhi while MP in India!!!!! Buddhu citizen likely to be cancelled because he cannot deny his British citizenship now. His tax returns are damning. Ensure BJP Govt returns on May 23rd"[8]

In my opinion he fits the definition of the Biggest Chowkidar.

Footnotes

[1] Dr Subramanian Swamy: Anti-Corruption Crusader, Hindutva Warrior, Economic Chanakya & Nuclear Strategist - Guruprasad's Portal
[2] Dr Subramanian Swamy: Anti-Corruption Crusader, Hindutva Warrior, Economic Chanakya & Nuclear Strategist - Guruprasad's Portal
[3] Dr Subramanian Swamy: Anti-Corruption Crusader, Hindutva Warrior, Economic Chanakya & Nuclear Strategist - Guruprasad's Portal
[4] Dr Subramanian Swamy: Anti-Corruption Crusader, Hindutva Warrior, Economic Chanakya & Nuclear Strategist - Guruprasad's Portal
[5] Subramanian Swamy - Wikipedia
[6] Subramanian Swamy's anti-corruption, legal battles and its effect on South Indian politics - PGurus
[7] Govt Notice to Rahul Gandhi After Subramanian Swamy's Complaint Over British Citizenship
[8] 'Citizenship likely to be cancelled': Swamy says Rahul Gandhi filed IT returns in UK as Brit citizen while MP in India | Latest News & Updates at DNAIndia.com

What is the most Telugu thing ever?

Tirupati: There is no Hindu family in A.P. and Telangana where at least one member has not visited Tirupati during their lifetime. Every Hindu household in both states has at least one picture of Venkateswara on its walls. Whoever goes to Tirupati will definitely bring back the Tirupati laddoo and distribute the prasadam to all the neighbours. This laddoo used to be very large compared to its size today. One has a blissful experience inhaling the sweet aroma of ghee, elaichi and dry fruits in the laddu. People feel happy when somebody gives them this prasadam.

Maa Telugu talliki malle poo danda: Every Telugu person is proud of this song. It is sung as a prayer song during school assembly in almost every school. I am not giving the description here, as some people have already mentioned it; Hari Shekar (హరి శేఖర్) has already provided the lyrics and a beautiful translation of the song.

We are proud of our leaders: These Telugu leaders make every Telugu person very proud.

1. Potti Sreeramulu (16 March 1901 – 15 December 1952): We should be proud of being born in the state where he was born. He is the person who gave us our identity as Andhras. Sreeramulu is revered as Amarajeevi ("Immortal Being") in the Andhra region for his self-sacrifice for the Andhra cause. He became famous for undertaking a hunger strike in support of the formation of an Indian state for the Telugu-speaking population of the Madras Presidency; he lost his life in the process. In his death procession, people shouted slogans praising his sacrifice, with thousands more joining as the procession reached Mount Road, Madras.
His death sparked public rioting, and Indian Prime Minister Jawaharlal Nehru declared the intent of the newly liberated nation to form Andhra State three days after Sreeramulu's death.

2. Pingali Venkayya (August 1876 – 4 July 1963): Telugu leaders have contributed not only to the state but to the nation. Pingali Venkayya was an Indian freedom fighter and the designer of the flag on which the Indian national flag was based. He was born at Bhatlapenumarru, near Masulipatnam, in what is now the Indian state of Andhra Pradesh.

3. Alluri Sitarama Raju: Whenever there is a fancy dress competition in an A.P. or Telangana school, there will be at least one student dressed as Alluri Sita Rama Raju. He was an Indian revolutionary involved in the Indian independence movement. After the passing of the 1882 Madras Forest Act, its restrictions on the free movement of tribal peoples in the forest prevented them from engaging in their traditional podu agricultural system, which involved shifting cultivation. Raju led the Rampa Rebellion of 1922–24, during which a band of tribal leaders and other sympathisers fought against the British Raj, which had passed the law. Our daughter dressed as Alluri Sita Rama Raju in her school days, and our grandson did so two years back.

4. Tanguturi Prakasam Pantulu (23 August 1872 – 20 May 1957): An Indian politician and freedom fighter, chief minister of the Madras Presidency, and subsequently the first chief minister of the new Andhra State, created by the partition of Madras State along linguistic lines. He was also known as Andhra Kesari (Lion of Andhra). The Andhra Pradesh government issued a G.O. on 10 August 2014 declaring his birth anniversary a state festival. Prakasam district in A.P. is named after him.

5. Kandukuri Veeresalingam Pantulu (16 April 1848 – 27 May 1919): A social reformer and writer of Andhra Pradesh. He is the father of the renaissance movement in Telugu.
He was one of the early social reformers who encouraged women's education and the remarriage of widows, which was not supported by society during his time, and he fought against the dowry system. His novel Rajasekhara Charitramu is considered the first novel in Telugu literature.

Food items:

1. Upma Pesarattu: Pesarattu is made by wet-grinding moong dal. Without upma it is not complete.
2. Chakkilaalu: Made from rice flour. A chakki press is never used; they are prepared by hand. Making chakkilalu is an art, and not all women can master it. It is a very healthy food because very little oil is absorbed while frying.
3. Pootharekulu: The recipe is already given on YouTube.
4. Kakinada Kaja: Very crunchy outer shell and very juicy inside. You need to be very careful while eating it, otherwise you will end up spilling sugar syrup all over your clothes.
5. Chegodeelu: Made from rice flour. A must-eat snack.
6. Pela Vodiyalu: Generally eaten along with rice but can also be eaten as a snack. Made with puffed rice.
7. Gummidi Vodiyalu: Generally eaten along with rice. Made with ground urad dal and ash gourd.
8. Sarva Pindi: Made with rice flour, with sesame seeds, peanuts, etc. added. Very popular in Telangana. Healthy and yummy; a good breakfast item.
9. Avakaya: There are two versions, a very spicy one and a sweet one (mostly prepared in coastal Andhra). Avakaya is a must with curd rice. It is usually stored in clay jars called 'aavakaya jaadi'.
10. Korivi Karam: A typical Guntur thing, made with red chillies (not dried).
11. Mirchi Bajji: Hot and spicy! Usually sold on beach roads, in parks, etc.
12. Boorelu: These sweets are usually prepared on festival days or other auspicious occasions. They are the best combination with pulihora.
13. Ariselu: Rice flour mixed with jaggery and deep-fried.
14. Bandaru Laddoo: The recipe is very complicated, but the taste is superb.
15. Sunnundalu: Made with ground urad dal mixed with powdered sugar and lots of ghee (clarified butter). In olden days it was compulsory for a nursing mother to eat them daily.
16. Gongura Pachchadi: However large the meal may be, it is not complete without gongura pachchadi. Made with a leafy vegetable called gongura, which is very sour in taste.
17. Hyderabadi Dum Biryani: Its aroma lingers not only in Hyderabad but throughout A.P. and Telangana. People coming to Hyderabad will not return without tasting it.

Rayalaseema ragi mudda and Telangana kallu also deserve a mention.

Handicrafts:

1. Lakka Pidathalu: No girl in A.P. or Telangana grows up without playing with these toys, which are made in Etikoppaka village. The artisans make the toys using a small lathe, which is why no Etikoppaka toy has straight edges. Lac is melted, colours are added to the molten lac, and after it solidifies, the coloured lac sticks are used on the lathe to apply colour to the toys. For a detailed description, read the article "Fairy tales, toy stories" written by Sridevi Datta in Live Mint.
2. Kondapalli Bommalu: Made with a type of wood which is very light. The colours applied are purely vegetable colours. They mostly depict village scenes. The Dasavatara set is famous and is kept in every bommala koluvu.
3. Nirmal Paintings: These paintings are made in the town of Nirmal in Telangana. The colours used are vegetable colours. Though a bit costly, people take pride in displaying them in their drawing room.

Handlooms: All the sarees below have been worn at least once by any Telugu lady.

1. Venkatagiri sarees
2. Uppada jari sarees
3. Mangalagiri sarees
4. Pochampalli sarees
5. Guntur handloom sarees

Festivals and culture:

1. Sankranti Ratham Muggu: In other answers, some people have described the Sankranti festival, so I am not repeating it. The 'Ratham Muggu' is a must during Sankranti days.
2. Gobbellu: These are made with cow dung mixed with fine mud and decorated with flowers. They are kept on the muggu, and girls dance around them singing 'gobbemma' songs, which you can watch on YouTube.
3. Gangireddula Vallu: On 'Kanuma' (the third day of Sankranti), bulls are decorated with flowers, coloured cloths, bells, etc., and taken from door to door. When someone offers clothes to the Gangireddu, he (the bull) salutes by bending his head; sometimes he even dances. There is a separate community called the Gangireddula Vallu.
4. Hari Dasu: To read about the life of a Hari Dasu, read the article "The song of Sankranti" by Sridevi Datta published in Hans India.
5. Burra Katha: There is a detailed description of this art form in Wikipedia.
6. Ashtavadhanam and Satavadhanam: There is a detailed description of this art form in Wikipedia.
8. Batakamma Pandaga (festival): There is a detailed description in Wikipedia.
9. Bonala Panduga (festival): There is a detailed description in Wikipedia.
10. Atla Taddi: There is a detailed description of this nomu in Wikipedia. In olden days this was the only festival where young girls had the opportunity to go out and play chemma chekka or oppula kuppa, or to go swinging. For young boys the festival gave the opportunity to do mischievous things like throwing palleru kayalu (a tiny thorny fruit) on the path so that they would stick to the girls' clothes, leading to mock fights between boys and girls. Girls apply gorintaku (mehendi) on their hands on Atla Taddi: the big circle in the middle is the chanda mama (moon), and the small circles around it are chukkalu (stars).
11. Ugadi Pachchadi: I have included this not under food items but under festivals and culture, because it is eaten only on Ugadi (the Telugu new year's day), after offering it as 'Naivedyam' to God.
It is prepared by mixing six ingredients of six tastes: tamarind (sour), salt (salty), jaggery (sweet), neem flowers (bitter), very tender raw mango (vagaru, or tangy in English), and green chilli (karam, or spicy in English). Taking this prasadam signifies that we should taste and digest all types of emotions.
Sadness – neem flowers for their bitterness
Happiness – jaggery for its sweetness
Anger – green chilli for its hot taste
Fear – salt for its saltiness
Disgust – tamarind juice for its sourness
Surprise – unripened mango for its tang
12. Vara Lakshmi Vratam: It is performed in the month of Sravana Masam. A coconut smeared with turmeric is adorned with jewelry. Puja is performed, and 7–9 types of prasadams are made and offered to the Devi. In the evening, pasupu kumkuma (turmeric and red kumkuma powder) is offered to the ladies.
13. Dasara Puli Veshagallu: During the Dasara festival, men paint their whole bodies like a tiger and dance on the streets like this:
Dance forms:
Kuchipudi dance (drama): There is a detailed description of this art form on Wikipedia. Bhama Kalapam is one of the popular Kuchipudi dance dramas.
Vedantam Satya Narayana Sharma: Below is a picture of him. He was a guru of Kuchipudi dance. Here he is performing Bhama Kalapam, that is, acting as Satyabhama in the Parijatapaharanam story in the Maha Bharata. You can watch Bhama Kalapam on YouTube. Here is the link.
Not so proud of these things:
1. Telugu TV anchors using 50% English words. Why can they not use at least 90% Telugu words? Their sentences go like this: "Naaku chinnappatininchi paatalante ishtam so nenu play back singerni ayyanu and naaku dance ante kuda chaala liking." (From my childhood I liked singing, so I became a playback singer, and I also like dancing.) Why the words 'so', 'and', and 'liking'? She could have used the words 'andukani', 'inkaa', and 'ishtam' respectively.
2. 
Clashes between Hero Abhimana Sanghalu: For every top hero there is an 'Abhimana Sangham' (hero worship group), and there is always a rivalry between two groups. It has been there since ANR and NTR (Sr.), then Krishna and Shobhan Babu, then Nagarjuna and Balakrishna. I don't know who came after them.
3. EAMCET exam craze: Like the IIT-JEE craze in other parts of India, there is this EAMCET craze in A.P. and Telangana.
4. Over-straining of students in concept schools: The school timings are from 7.30 am to 7.30 pm. Imagine the plight of the students.
5. Exchange of hefty amounts of money, gold, and property as dowry, and being proud of it: this too is one of the Telugu things. Dowries range from lakhs of rupees to crores of rupees.
6. Most Telugu heroines are not Telugu people. They are either from the north or from other south Indian states. Not exhaustive, but some of them are shown here.
1. Ghantasala: All Telugu people are proud of the singer Ghantasala. Even today youngsters love his songs. Link to one of his devotional songs:
2. Maya Bazar: This picture was released somewhere around 1957, but even today this movie is played at least once a year in some theater or other. The movie was colorized 6–7 years ago. Some of the scenes in the movie: in the video below, Ghatotkacha is disguised as Sasirekha. See Savitri's superb acting.
3. Bobbili Veena: Read about it on Wikipedia. Any person learning the veena is proud to possess a Bobbili Veena.
4. Bapu Bommalu: Bapu is one of the greatest artists. He is famous for his line drawings. He is also a good movie director. Any girl in A.P. with a slender body, long hair, and big eyes is lovingly called 'Bapu Bomma', bomma meaning picture. Budugu and Seeganapesunamba are cute and famous creations of Bapu and Mullapudi Venkata Ramana (Ramana created them and Bapu gave them an image). Budugu is mischievous like Dennis the Menace. 
Budugu is afraid of nobody except Seeganapesunamba, who is his girlfriend. Here is a picture of Bapu and Ramana. Mullapudi Ramana is one of the famous Telugu writers, known for his humor.
5. Muggulu: Every morning, the first job of the homemaker is to draw a muggu in front of the front door. A muggu is drawn with lime powder mixed with rice flour. First, dots are placed in a symmetric order; then the muggu is drawn either by joining the dots or by drawing curved lines around them.
6. Mogali Rekulu: Throughout South India, girls and ladies adorn their hair with flowers like malle, sanna jaaji, kanakambaram, etc. But only in the Telugu states do they also adorn their hair with 'mogali rekulu'. (When I was searching on Google for an English word for mogali rekulu, I came to know that there is actually a Telugu TV serial with that name.) The mogali poda (bush) looks like this. There is a saying that there are always snakes in these bushes (mogali poda kinda eppudoo paamulu untaayi). The mogali reku is the ripened leaf of the bush. It has a very strong fragrance and looks like this. It is folded like a ribbon bow and worn in the hair.
7. Kalipatnam Rama Rao: He is one of the famous Telugu writers. You may be wondering why I have included his name among the "most Telugu things" when there are many more famous writers in Telugu. I will tell you. He is now 92 years old. Any aspiring Telugu writer would be honoured if Kara Mestaru (that is how he is lovingly called by all Telugu people) read and reviewed his or her story. What makes his name a most Telugu thing is his devotion to the Telugu story and to Katha Nilayam. He has a huge library in Srikakulam called Katha Nilayam. The library contains all the printed stories, starting from the very first Telugu story, published in 1910 (written by Gurajada Appa Rao), till date. He spent all his post-retirement benefits, his ancestral property, and all the money received from various awards (including the Sahitya Akademi award) on building Katha Nilayam. 
Many research scholars go to this library to read old stories. He has collected not only printed copies of stories but also the addresses, phone numbers, and photos of all Telugu writers. Hundreds of photos of Telugu writers hang on the walls of this library.
8. River Godavari: It is a river, so why include it among the 'Telugu things'? OK, let me tell you. The river Godavari mingles with our daily life. Many metaphors are made using the Godavari: 'nindu godaari la', very sober and composed; 'gala gala godaari la', very dynamic and mischievous. Many movies have been made on the banks of the Godavari (mostly by Bapu and Viswanath). There may be a hundred songs on the Godavari by now. I am giving two links, one to an old song and another to a not-so-old song.
9. Pedda Bala Siksha: If you go to any big book house in the Telugu states, you are bound to find this book containing more than one thousand pages. It is called Pedda Bala Siksha. You can call it the 'Encyclopedia of Telugu'. It contains the names of the sixty Telugu years, the twelve Telugu months, the days of the week, the seasons of the year, the meaning of solar and lunar eclipses, morals, Telugu idioms, science, culture, games, number tables, Telugu satakalu (100+ verses), and much more. It is suitable for children and adults. Up until the 1970s, every Telugu house used to have a copy. Even today people purchase this book to teach Telugu to their children who are studying in English medium.
10. Kadiyam Nurseries: The news clipping below explains everything. A photo from a personal album of a Kadiyam nursery:
Telugu wedding:
1. Jeelakarra Bellam: This is the main event in Telugu marriages. Jeera and jaggery are mixed and pounded into a coarse paste. Exactly at the muhurtham time, this paste is placed by the bride and groom on each other's heads.
2. Talambralu: This is a fun event in Telugu marriages. The bride and groom pour rice on each other's heads. The people around them cheer them on to pour more and more rice and win the pouring race. There is a saying: 
In Rama and Sita's wedding, pearls were used as talambralu. When the pearls were in Rama's hands they looked like sapphires, and when they were in Sita's hands they looked like rubies.
3. Nagavalli: After the tying of the mangala sutram, there is a ritual called nagavalli. Here the groom ties black beads, threaded by kanyalu (unmarried girls), around the bride's neck, after which the bride places one foot after the other on the 'sanni kallu' (grinding stone) and the groom adorns her second toe with a silver ring called 'mette'.
Poola Jada: In a typical Telugu marriage, the poola jada is a must. Without a poola jada, the bridal makeup is not complete.
Fruit varieties exclusive to the Telugu states: I got excited after reading this news. These fruits make us proud:
1. Banginapalli Mamidi (mango): The king of fruits. The Banginapalli mango is likely to get a geographical indication, as the Directorate of Horticulture and the State Horticulture University have embarked upon the exercise.
2. Nuzividu Rasalu: This is the most popular juice mango from the Andhra Pradesh region of India. It usually comes in two sizes; the smaller, more popular one is called chinna rasalu. The most sought-after chinna rasalu comes from the Nuzvid region of Andhra. This mango is extremely sweet and juicy when ripe, and once ripe it needs to be consumed immediately. Most people extract the juice and freeze it for the times when there are no mangoes. Frozen rasalu juice is a popular delicacy at celebrations and weddings. Its peak season is from mid-May to mid-June.
3. 
Chakrakeli Arati Pallu (bananas): A tambulam is not complete without this banana.
Konaseema Kobbari (coconut):
Telugu nursery rhymes: There are many cute nursery rhymes in Telugu.
1. Chitti chilakamma
2. Burru pitta burru pitta turrru mannadi
3. Chuk chuk rail vastondi
4. Chandamama rave
5. Bujji meka bujji meka
6. Kothi bavaku pellanta
7. Seethamma vakitlo sirimalle chettu
8. Veeri veeri gummadi pandu
9. Chenna patnam cheruku mukka
10. Bujji papa bujji papa
Edit 1: After some suggestions, I have included Sarvapindi in the food items and Telugu heroines in the 'not so proud of' section.
Edit 2: Regarding Tirupati: Chitraketh Kataru has mentioned an interesting thing about a place in Telangana where people are not allowed to go to Tirupati. It is Maldakal in Mahabubnagar district. Actually, I am not completely aware of the story, but it is said that if villagers of Maldakal visit Tirupati, something unusually bad happens. Hence people never dare to go to Tirumala.
Edit 3: Encouraged by the number of upvotes, I have included some more things in my answer. I am overwhelmed to see that most of the upvoters are youngsters, and I am proud that the young generation is so interested in our culture. Some readers have advised me to include the food items I had omitted, so that somebody who has no time to read all the answers does not miss anything important. So I have included as many items as possible. I have now added the information about Kara Mastaru, the river Godavari, mogali rekulu, poola jada, Dasara puli veshalu, Telugu janapada natyalu, and nagavalli, which I did not include in my previous version.
Edit 4: While writing this answer I kept the present-day scenario in mind, which is why I did not include the names of great Telugu leaders. But now I feel I would not do justice to the answer without mentioning their names. Thank you, Abhimanyu, for the suggestion. I have also added Hyderabadi Dum Biryani.
Edit 5: Added something more about Atla Taddi. 
Thank you, Dileep Katari (దిలీప్ కటారి), for the suggestion.
Edit 6: Added Telugu nursery rhymes and Tanguturi Prakasam Pantulu.
Edit 7: Added Kandukuri Veeresalingam and Pedda Bala Siksha.
Edit 8: Added fruits of the Telugu states.
Edit 9: Added Kadiyam nurseries.
The images of Etikoppaka, Gangireddula Vallu, and Hari Dasu are from my personal album. For all other images, the source is Google.

Who has the world's greatest CV ever?

This is Mr. Chih-Jen Lin. My browser got stuck for a while when I tried to paste this CV here. Anyway, Mr. Chih-Jen Lin is a notable person who has contributed a lot to the field of machine learning and related areas. He is a Professor of Computer Science at National Taiwan University and a leading researcher in machine learning, optimization, and data mining. He is best known for the open-source library LIBSVM, an implementation of support vector machines.
This is his home page URL: Welcome to Chih-Jen Lin's Home Page
And here comes his 28-page-long simple CV: http://www.csie.ntu.edu.tw/~cjlin/resume.pdf
Brace yourself and scroll down!!
Chih-Jen Lin
• PERSONAL DATA
1. Address: Department of Computer Science and Information Engineering, National Taiwan University, Taipei 106, Taiwan
2. Phone: (886) 2-33664923, Fax: (886) 2-23628167
3. E-mail: [email protected]
4. Homepage: Welcome to Chih-Jen Lin's Home Page
• EDUCATION AND CURRENT POSITION:
1. Distinguished professor, Department of Computer Science and Information Engineering, National Taiwan University, Taipei 106, Taiwan, 2011–present
2. Adjunct distinguished professor, Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei 106, Taiwan, August 2011–present
3. Adjunct distinguished professor, Graduate Institute of Industrial Engineering, National Taiwan University, Taipei 106, Taiwan, August 2011–present
4. Ph.D., Industrial & Operations Engineering, University of Michigan, September 1995–May 1998
5. M.S.E., Industrial & Operations Engineering, University of Michigan, September 1995–December 1996
6. B.S., Mathematics, National Taiwan University, October 1989–June 1993
• RESEARCH INTERESTS:
1. 
Machine learning: support vector machines, large-scale data classification, and applications. We develop popular machine learning software including LIBSVM (http://www.csie.ntu.edu.tw/~cjlin/libsvm) and LIBLINEAR (http://www.csie.ntu.edu.tw/~cjlin/liblinear). According to Most Cited Articles in Computer Science, LIBSVM is among the 10 most cited computer science works of all time.
2. Large-scale optimization and its applications
• AWARDS AND RECOGNITION:
– International:
1. ACM Fellow, 2015
2. AAAI Fellow, 2014
3. Best paper award, ACM Recommender Systems 2013 (with students Yong Zhuang, Wei-Sheng Chin, and Yu-Chin Juan)
4. ACM Distinguished Scientist, 2011
5. IEEE Fellow (class of 2011) for contributions to support vector machine algorithms and software
6. Member of the NTU team that won first prize in the KDD Cup 2010, 2011, and 2013
7. Best research paper award, ACM KDD 2010 (with students Hsiang-Fu Yu, Cho-Jui Hsieh, and Kai-Wei Chang)
8. Supervised students Chia-Hua Ho and Ming-Hen Tsai to win 2nd place in the Active Learning Challenge 2010 (Active learning - Causality Workbench)
9. Member of the NTU team that won 3rd place in the KDD Cup 2009 (extended track)
10. Winner of the ICML 2008 large-scale learning challenge (linear SVM track; with students Hsiang-Fu Yu, Cho-Jui Hsieh, and Kai-Wei Chang), http://largescale.first.fraunhofer.de/summary/
11. Supervised student Yin-Wen Chang to win the WCCI 2008 Causation and Prediction Challenge (Causality Workbench)
12. Winner of the WCCI 2002 competition on sequence recognition (with master students Ming-Wei Chang and Bo-Juen Chen)
13. Winner of the EUNITE 2001 worldwide competition (18 research groups) on electricity load prediction (Electricity Load Forecast using Intelligent Adaptive Technology). EUNITE is the European Network of Excellence on Intelligent Technology for smart adaptive systems (with master students Ming-Wei Chang and Bo-Juen Chen)
14. Winner of the IJCNN Challenge 2001. 
IJCNN is one of the major neural networks conferences (with master student Chih-Chung Chang).
15. Winner of the OCR (Optical Character Recognition) competition organized by the University of Essex and the UK Post Office, December 2000 (with master student Chih-Chung Chang)
16. Second prize in the student paper competition, Fifth Copper Mountain Conference on Iterative Methods, 1998
17. Wallace J. Givens Research Associate (twice): competitive positions in the Mathematics and Computer Science Division of Argonne National Laboratory, intended to encourage graduate students who are beginning careers in computational science
– Domestic:
1. Outstanding research award of the Pan Wen Yuan Foundation, Taiwan, 2016
2. Pegatron Chair Professorship, 2016
3. Teco Award, 2015
4. Macronix International Co. Chair Professorship, 2014
5. K. T. Li Breakthrough Award, Institute of Information & Computing Machinery, Taiwan, 2012
6. NTU EECS Academic Excellence Award, NTU College of EECS, 2011
7. Ten Outstanding Young Persons of Taiwan, 2011
8. Distinguished Scholar Research Project, National Science Council, Taiwan, 2009–2012
9. Outstanding Research Award, National Science Council, Taiwan, 2007, 2010, and 2013
10. Ta-You Wu Memorial Award, National Science Council, Taiwan, 2006
11. Fu Ssu-Nien Award of National Taiwan University, 2005
12. Research award for young researchers from the Pan Wen-Yuan Foundation, Taiwan, 2003
13. K. T. Li award for young researchers from the ACM Taipei/Taiwan chapter, July 2002 (one awarded per year to young computer scientists in Taiwan)
14. Young investigator award from Academia Sinica, Taiwan, May 2002 (15 awarded per year across all research areas)
15. Prize for Outstanding Performance, National Mathematics Contest, R.O.C., 1989
• PROFESSIONAL EXPERIENCE:
1. Visiting researcher, Microsoft, January 2015 – September 2015, August 2016 – February 2017
2. Visiting principal research scientist, eBay Research Labs, January 2012 – September 2012
3. 
Visiting scientist, Google Research, February 2008 – September 2008
4. Visiting scientist, Yahoo! Research, Burbank, California, August 2006 – February 2007
5. Distinguished Professor (August 2011–present), Professor (August 2006–present), Associate Professor (August 2002–August 2006), Assistant Professor (August 1998–August 2002), Department of Computer Science and Information Engineering, National Taiwan University, Taipei 106, Taiwan
6. Adjunct Associate Professor, Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei 106, Taiwan, August 2004–August 2006
7. Adjunct Associate Professor (August 2002–August 2006), Adjunct Assistant Professor (August 2001–August 2002), Graduate Institute of Industrial Engineering, National Taiwan University, Taipei 106, Taiwan
8. Visiting Scientist, Mathematics and Computer Science Division, Argonne National Laboratory, January 1999–February 1999, May 1999–August 1999
9. Research Associate, Mathematics and Computer Science Division, Argonne National Laboratory, January 1997–April 1997, September 1997–September 1998
10. Wallace J. Givens Research Associate, Mathematics and Computer Science Division, Argonne National Laboratory, May 1996–August 1996 and May 1997–August 1997
11. Research Assistant, Department of Industrial and Operations Engineering, University of Michigan, September 1995–August 1998
12. Teaching Assistant, Department of Industrial and Operations Engineering, University of Michigan, September 1996–December 1996
13. Second Lieutenant, R.O.C. Army, July 1993 – May 1995
• JOURNAL PAPERS:
[1] Wei-Sheng Chin, Bo-Wen Yuan, Meng-Yuan Yang, Yong Zhuang, Yu-Chin Juan, and Chih-Jen Lin. LIBMF: A library for parallel matrix factorization in shared-memory systems. Journal of Machine Learning Research, 17(86):1–5, 2016. URL https://www.csie.ntu.edu.tw/~cjlin/papers/libmf/libmf_open_source.pdf.
[2] Wei-Sheng Chin, Yong Zhuang, Yu-Chin Juan, and Chih-Jen Lin. 
A fast parallel stochastic gradient method for matrix factorization in shared memory systems. ACM Transactions on Intelligent Systems and Technology, 6:2:1–2:24, 2015. URL http://www.csie.ntu.edu.tw/~cjlin/papers/libmf/libmf_journal.pdf.
[3] Chien-Chih Wang, Chun-Heng Huang, and Chih-Jen Lin. Subsampled Hessian Newton methods for supervised learning. Neural Computation, 27:1766–1795, 2015. URL http://www.csie.ntu.edu.tw/~cjlin/papers/sub_hessian/sample_hessian.pdf.
[4] Po-Wei Wang and Chih-Jen Lin. Iteration complexity of feasible descent methods for convex optimization. Journal of Machine Learning Research, 15:1523–1548, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cdlinear.pdf.
[5] Ching-Pei Lee and Chih-Jen Lin. Large-scale linear rankSVM. Neural Computation, 26(4):781–817, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ranksvm/ranksvml2.pdf.
[6] Ching-Pei Lee and Chih-Jen Lin. A study on L2-loss (squared hinge-loss) multi-class SVM. Neural Computation, 25(5):1302–1323, 2013. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l2mcsvm/l2mcsvm.pdf.
[7] Chia-Hua Ho and Chih-Jen Lin. Large-scale linear support vector regression. Journal of Machine Learning Research, 13:3323–3348, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/linear-svr.pdf.
[8] Guo-Xun Yuan, Chia-Hua Ho, and Chih-Jen Lin. An improved GLMNET for l1-regularized logistic regression. Journal of Machine Learning Research, 13:1999–2030, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l1_glmnet/long-glmnet.pdf.
[9] Guo-Xun Yuan, Chia-Hua Ho, and Chih-Jen Lin. Recent advances of large-scale linear classification. Proceedings of the IEEE, 100(9):2584–2603, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/survey-linear.pdf.
[10] Hsiang-Fu Yu, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Large linear classification when data cannot fit in memory. ACM Transactions on Knowledge Discovery from Data, 5(4):23:1–23:23, February 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/kdd_disk_decomposition.pdf.
[11] Chih-Chung Chang and Chih-Jen Lin. 
LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2(3):27:1–27:27, 2011. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[12] Wen-Yen Chen, Yangqiu Song, Hongjie Bai, Chih-Jen Lin, and Edward Y. Chang. Parallel spectral clustering in distributed systems. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(3):568–586, 2011.
[13] Ruby C. Weng and Chih-Jen Lin. A Bayesian approximation method for online ranking. Journal of Machine Learning Research, 12:267–300, 2011. URL http://www.csie.ntu.edu.tw/~cjlin/papers/online_ranking/online_journal.pdf.
[14] Hsiang-Fu Yu, Fang-Lan Huang, and Chih-Jen Lin. Dual coordinate descent methods for logistic regression and maximum entropy models. Machine Learning, 85(1-2):41–75, October 2011. URL http://www.csie.ntu.edu.tw/~cjlin/papers/maxent_dual.pdf.
[15] Guo-Xun Yuan, Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. A comparison of optimization methods and software for large-scale l1-regularized linear classification. Journal of Machine Learning Research, 11:3183–3234, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l1.pdf.
[16] Yin-Wen Chang, Cho-Jui Hsieh, Kai-Wei Chang, Michael Ringgaard, and Chih-Jen Lin. Training and testing low-degree polynomial data mappings via linear SVM. Journal of Machine Learning Research, 11:1471–1490, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/lowpoly_journal.pdf.
[17] Fang-Lan Huang, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Iterative scaling and coordinate descent methods for maximum entropy. Journal of Machine Learning Research, 11:815–848, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/maxent_journal.pdf.
[18] Chih-Jen Lin, Stefano Lucidi, Laura Palagi, Arnaldo Risi, and Marco Sciandrone. Decomposition algorithm model for singly linearly constrained problems subject to lower and upper bounds. Journal of Optimization Theory and Applications, 141:107–126, 2009.
[19] Tzu-Kuo Huang, Chih-Jen Lin, and Ruby C. Weng. 
Ranking individuals by group comparisons. Journal of Machine Learning Research, 9:2187–2216, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/genBTexp/genBTexp-jmlr.pdf.
[20] Rong-En Fan, Kai-Wei Chang, Cho-Jui Hsieh, Xiang-Rui Wang, and Chih-Jen Lin. LIBLINEAR: a library for large linear classification. Journal of Machine Learning Research, 9:1871–1874, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/liblinear.pdf.
[21] Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. Coordinate descent method for large-scale L2-loss linear SVM. Journal of Machine Learning Research, 9:1369–1398, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cdl2.pdf.
[22] Chih-Jen Lin, Ruby C. Weng, and S. Sathiya Keerthi. Trust region Newton method for large-scale logistic regression. Journal of Machine Learning Research, 9:627–650, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/logistic.pdf.
[23] Hsuan-Tien Lin, Chih-Jen Lin, and Ruby C. Weng. A note on Platt's probabilistic outputs for support vector machines. Machine Learning, 68:267–276, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf.
[24] Chih-Jen Lin. On the convergence of multiplicative update algorithms for non-negative matrix factorization. IEEE Transactions on Neural Networks, 18(6):1589–1596, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/multconv.pdf.
[25] Chih-Jen Lin. Projected gradient methods for non-negative matrix factorization. Neural Computation, 19:2756–2779, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/pgradnmf.pdf.
[26] Tzu-Kuo Huang, Ruby C. Weng, and Chih-Jen Lin. Generalized Bradley-Terry models and multi-class probability estimates. Journal of Machine Learning Research, 7:85–115, 2006. URL http://www.csie.ntu.edu.tw/~cjlin/papers/generalBT.pdf.
[27] Pai-Hsuen Chen, Rong-En Fan, and Chih-Jen Lin. A study on SMO-type decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 17:893–908, July 2006. 
URL http://www.csie.ntu.edu.tw/~cjlin/papers/generalSMO.pdf.
[28] Rong-En Fan, Pai-Hsuen Chen, and Chih-Jen Lin. Working set selection using second order information for training SVM. Journal of Machine Learning Research, 6:1889–1918, 2005. URL http://www.csie.ntu.edu.tw/~cjlin/papers/quadworkset.pdf.
[29] Ming-Wei Chang and Chih-Jen Lin. Leave-one-out bounds for support vector regression model selection. Neural Computation, 17(5):1188–1222, 2005.
[30] Pai-Hsuen Chen, Chih-Jen Lin, and Bernhard Schölkopf. A tutorial on ν-support vector machines. Applied Stochastic Models in Business and Industry, 21:111–136, 2005. URL http://www.csie.ntu.edu.tw/~cjlin/papers/nusvmtoturial.pdf.
[31] Ting-Fan Wu, Chih-Jen Lin, and Ruby C. Weng. Probability estimates for multi-class classification by pairwise coupling. Journal of Machine Learning Research, 5:975–1005, 2004. URL http://www.csie.ntu.edu.tw/~cjlin/papers/svmprob/svmprob.pdf.
[32] Bo-Juen Chen, Ming-Wei Chang, and Chih-Jen Lin. Load forecasting using support vector machines: A study on EUNITE competition 2001. IEEE Transactions on Power Systems, 19(4):1821–1830, November 2004.
[33] Wei-Chun Kao, Kai-Min Chung, Chia-Liang Sun, and Chih-Jen Lin. Decomposition methods for linear support vector machines. Neural Computation, 16(8):1689–1704, 2004. URL http://www.csie.ntu.edu.tw/~cjlin/papers/linear.pdf.
[34] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Analysis of switching dynamics with competing support vector machines. IEEE Transactions on Neural Networks, 15(3):720–727, 2004.
[35] Chin-Sheng Yu, Chih-Jen Lin, and Jen-Kang Hwang. Predicting subcellular localization of proteins for Gram-negative bacteria by support vector machines based on n-peptide compositions. Protein Science, 13:1402–1406, 2004.
[36] Kai-Min Chung, Wei-Chun Kao, Chia-Liang Sun, Li-Lun Wang, and Chih-Jen Lin. Radius margin bounds for support vector machines with the RBF kernel. Neural Computation, 15:2643–2681, 2003.
[37] S. Sathiya Keerthi and Chih-Jen Lin. 
Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Computation, 15(7):1667–1689, 2003.
[38] Kuan-Min Lin and Chih-Jen Lin. A study on reduced support vector machines. IEEE Transactions on Neural Networks, 14(6):1449–1559, 2003. URL http://www.csie.ntu.edu.tw/~cjlin/papers/rsvmTEX.pdf.
[39] Chin-Sheng Yu, Jung-Ying Wang, Jinn-Moon Yang, Ping-Chiang Lyu, Chih-Jen Lin, and Jen-Kang Hwang. Fine-grained protein fold assignment by support vector machines using generalized n-peptide coding schemes and jury voting from multiple-parameter sets. Proteins, 50:531–536, 2003.
[40] Chih-Jen Lin. A formal analysis of stopping criteria of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 13(5):1045–1052, 2002. URL http://www.csie.ntu.edu.tw/~cjlin/papers/stop.ps.gz.
[41] Chih-Jen Lin. Asymptotic convergence of an SMO algorithm without any assumptions. IEEE Transactions on Neural Networks, 13(1):248–250, 2002. URL http://www.csie.ntu.edu.tw/~cjlin/papers/q2conv.pdf.
[42] Chih-Chung Chang and Chih-Jen Lin. Training ν-support vector regression: Theory and algorithms. Neural Computation, 14(8):1959–1977, 2002.
[43] Shuo-Peng Liao, Hsuan-Tien Lin, and Chih-Jen Lin. A note on the decomposition methods for support vector regression. Neural Computation, 14:1267–1281, 2002.
[44] Chih-Wei Hsu and Chih-Jen Lin. A comparison of methods for multi-class support vector machines. IEEE Transactions on Neural Networks, 13(2):415–425, 2002.
[45] Chih-Wei Hsu and Chih-Jen Lin. A simple decomposition method for support vector machines. Machine Learning, 46:291–314, 2002.
[46] Chih-Jen Lin. On the convergence of the decomposition method for support vector machines. IEEE Transactions on Neural Networks, 12(6):1288–1298, 2001. URL http://www.csie.ntu.edu.tw/~cjlin/papers/conv.ps.gz.
[47] Jinn-Moon Yang, Jorng-Tzong Horng, Chih-Jen Lin, and Cheng-Yan Kao. Optical coating design using the family competition evolutionary algorithm. 
Evolutionary Computation, 9(4):421–444, 2001.
[48] Chih-Chung Chang and Chih-Jen Lin. Training ν-support vector classifiers: Theory and algorithms. Neural Computation, 13(9):2119–2147, 2001.
[49] Chih-Jen Lin. Formulations of support vector machines: a note from an optimization point of view. Neural Computation, 13(2):307–317, 2001.
[50] Shu-Cherng Fang, Chih-Jen Lin, and Soon-Yi Wu. Solving quadratic semi-infinite programming problems by using relaxed cutting plane scheme. Journal of Computational and Applied Mathematics, 129:89–104, 2001.
[51] Soon-Yi Wu, Shu-Cherng Fang, and Chih-Jen Lin. Solving the general capacity problem. Annals of Operations Research, 103:193–211, 2001.
[52] Chih-Chung Chang, Chih-Wei Hsu, and Chih-Jen Lin. The analysis of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 11(4):1003–1008, 2000.
[53] Chih-Jen Lin and Romesh Saigal. An incomplete Cholesky factorization for dense matrices. BIT, 40:536–558, 2000.
[54] Chih-Jen Lin and Jorge J. Moré. Newton's method for large-scale bound constrained problems. SIAM Journal on Optimization, 9:1100–1127, 1999.
[55] Chih-Jen Lin and Jorge J. Moré. Incomplete Cholesky factorizations with limited memory. SIAM J. Sci. Comput., 21:24–45, 1999.
[56] Shu-Cherng Fang, Soon-Yi Wu, and Chih-Jen Lin. Relaxed cutting plane method for solving linear semi-infinite programming problems. Journal of Optimization Theory and Applications, 99:759–779, 1998.
[57] Chih-Jen Lin, Soon-Yi Wu, and Shu-Cherng Fang. An unconstrained convex programming approach for solving linear semi-infinite programming problems. SIAM Journal on Optimization, 8(2), 1998.
[58] Chih-Jen Lin, Soon-Yi Wu, and Shu-Cherng Fang. On the parametric linear semi-infinite optimization. Applied Mathematics Letters, 9:89–96, 1996.
[59] Chih-Jen Lin, E. K. Yang, Shu-Cherng Fang, and Soon-Yi Wu. Implementation of an inexact approach to solving linear semi-infinite programming problems. 
Journal of Computational and Applied Mathematics, 61:87–103, 1995.
[60] Shu-Cherng Fang, Chih-Jen Lin, and Soon-Yi Wu. On solving convex quadratic semi-infinite programming problems. Optimization, 37:107–125, 1994.
• REFEREED CONFERENCE PAPERS
Some papers here are preliminary versions of journal papers.
[1] Yuchin Juan, Yong Zhuang, Wei-Sheng Chin, and Chih-Jen Lin. Field-aware factorization machines for CTR prediction. In Proceedings of the ACM Recommender Systems Conference (RecSys), 2016.
[2] Wei-Lin Chiang, Mu-Chu Lee, and Chih-Jen Lin. Parallel dual coordinate descent method for large-scale linear classification in multi-core environments. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2016. URL http://www.csie.ntu.edu.tw/~cjlin/papers/multicore_cddual.pdf.
[3] Hsin-Yuan Huang and Chih-Jen Lin. Linear and kernel classification: When to use which? In Proceedings of SIAM International Conference on Data Mining (SDM), 2016. URL http://www.csie.ntu.edu.tw/~cjlin/papers/kernel-check/kcheck.pdf.
[4] Mu-Chu Lee, Wei-Lin Chiang, and Chih-Jen Lin. Fast matrix-vector multiplications for large-scale logistic regression on shared-memory systems. In Proceedings of the IEEE International Conference on Data Mining (ICDM), 2015. URL http://www.csie.ntu.edu.tw/~cjlin/papers/multicore_liblinear_icdm.pdf.
[5] Bo-Yu Chu, Chia-Hua Ho, Cheng-Hao Tsai, Chieh-Yen Lin, and Chih-Jen Lin. Warm start for parameter selection of linear classifiers. In Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2015. URL http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/warm-start/warm-start.pdf.
[6] Wei-Sheng Chin, Yong Zhuang, Yu-Chin Juan, and Chih-Jen Lin. A learning-rate schedule for stochastic gradient methods to matrix factorization. In Proceedings of the Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2015. 
URL http://www.csie.ntu.edu.tw/~cjlin/papers/libmf/mf_adaptive_pakdd.pdf.
[7] Yong Zhuang, Wei-Sheng Chin, Yu-Chin Juan, and Chih-Jen Lin. Distributed Newton method for regularized logistic regression. In Proceedings of the Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2015.
[8] Chieh-Yen Lin, Cheng-Hao Tsai, Ching-Pei Lee, and Chih-Jen Lin. Large-scale logistic regression and linear support vector machines using Spark. In Proceedings of the IEEE International Conference on Big Data, pages 519–528, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/spark-liblinear/spark-liblinear.pdf.
[9] Meng-Chieh Yu, Tong Yu, Shao-Chen Wang, Chih-Jen Lin, and Edward Y. Chang. Big data small footprint: The design of a low-power classifier for detecting transportation modes. Proceedings of the VLDB Endowment, 7:1429–1440, 2014.
[10] Cheng-Hao Tsai, Chieh-Yen Lin, and Chih-Jen Lin. Incremental and decremental training for linear classification. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ws/inc-dec.pdf.
[11] Tzu-Ming Kuo, Ching-Pei Lee, and Chih-Jen Lin. Large-scale kernel rankSVM. In Proceedings of SIAM International Conference on Data Mining, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ranksvm/kernel.pdf.
[12] Yong Zhuang, Wei-Sheng Chin, Yu-Chin Juan, and Chih-Jen Lin. A fast parallel SGD for matrix factorization in shared memory systems. In Proceedings of the ACM Recommender Systems, 2013. URL http://www.csie.ntu.edu.tw/~cjlin/papers/libmf.pdf.
[13] Raffay Hamid, Dennis Decoste, and Chih-Jen Lin. Dense non-rigid point-matching using random projections. In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013.
[14] Aditya Khosla, Raffay Hamid, Chih-Jen Lin, and Neel Sundaresan. Large-scale video summarization using web-image priors.
In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013.
[15] Guo-Xun Yuan, Chia-Hua Ho, and Chih-Jen Lin. An improved GLMNET for l1-regularized logistic regression. In Proceedings of the Seventeenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 33–41, 2011.
[16] Chia-Hua Ho, Ming-Hen Tsai, and Chih-Jen Lin. Active learning and experimental design with SVMs. In JMLR Workshop and Conference Proceedings: Workshop on Active Learning and Experimental Design, volume 16, pages 71–84, 2011. URL http://www.csie.ntu.edu.tw/~cjlin/papers/activelearning/activelearning.pdf.
[17] Hsiang-Fu Yu, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Large linear classification when data cannot fit in memory. In Proceedings of the Sixteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 833–842, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/kdd_disk_decomposition.pdf.
[18] Fang-Lan Huang, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Iterative scaling and coordinate descent methods for maximum entropy. In Proceedings of the 47th Annual Meeting of the Association of Computational Linguistics (ACL), 2009. Short paper.
[19] Yin-Wen Chang and Chih-Jen Lin. Feature ranking using linear SVM. In JMLR Workshop and Conference Proceedings: Causation and Prediction Challenge (WCCI 2008), volume 3, pages 53–64, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/causality.pdf.
[20] Yangqiu Song, Wen-Yen Chen, Hongjie Bai, Chih-Jen Lin, and Edward Y. Chang. Parallel spectral clustering. In European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML/PKDD), 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ecml08.pdf.
[21] S. Sathiya Keerthi, Sellamanickam Sundararajan, Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. A sequential dual method for large scale multi-class linear SVMs.
In Proceedings of the Fourteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 408–416, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/sdm_kdd.pdf.
[22] Cho-Jui Hsieh, Kai-Wei Chang, Chih-Jen Lin, S. Sathiya Keerthi, and Sellamanickam Sundararajan. A dual coordinate descent method for large-scale linear SVM. In Proceedings of the Twenty Fifth International Conference on Machine Learning (ICML), 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cddual.pdf.
[23] Chih-Jen Lin, Ruby C. Weng, and S. Sathiya Keerthi. Trust region Newton method for large-scale logistic regression. In Proceedings of the 24th International Conference on Machine Learning (ICML), 2007. Software available at http://www.csie.ntu.edu.tw/~cjlin/liblinear.
[24] Tzu-Kuo Huang, Chih-Jen Lin, and Ruby C. Weng. Ranking individuals by group comparisons. In Proceedings of the Twenty Third International Conference on Machine Learning (ICML), 2006.
[25] Pai-Hsuen Chen, Rong-En Fan, and Chih-Jen Lin. Training support vector machines via SMO-type decomposition methods. In Proceedings of the 16th International Conference on Algorithmic Learning Theory (ALT 2005), pages 45–62, 2005.
[26] Tzu-Kuo Huang, Ruby C. Weng, and Chih-Jen Lin. A generalized Bradley-Terry model: From group competition to individual skill. In Advances in Neural Information Processing Systems 17. MIT Press, Cambridge, MA, 2005.
[27] Ting-Fan Wu, Chih-Jen Lin, and Ruby C. Weng. Probability estimates for multi-class classification by pairwise coupling. In Sebastian Thrun, Lawrence Saul, and Bernhard Schölkopf, editors, Advances in Neural Information Processing Systems 16. MIT Press, Cambridge, MA, 2004.
[28] Kai-Min Chung, Wei-Chun Kao, Tony Sun, and Chih-Jen Lin. Decomposition methods for linear support vector machines. In Proceedings of ICASSP 2003, pages 868–871, 2003.
[29] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Adaptive deterministic annealing for two applications: competing SVR of switching dynamics and travelling salesman problems.
In Proceedings of ICONIP 2002, pages 920–924, 2002.
[30] Kai-Min Chung, Wei-Chun Kao, Tony Sun, Li-Lun Wang, and Chih-Jen Lin. Radius margin bounds for support vector machines with the RBF kernel. In Proceedings of ICONIP 2002, pages 893–897, 2002.
[31] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Analysis of nonstationary time series using support vector machines. In Seong-Whan Lee and Alessandro Verri, editors, Proceedings of SVM 2002, Lecture Notes in Computer Science 2388, pages 160–170, New York, NY, USA, 2002. Springer-Verlag Inc.
[32] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Analysis of switching dynamics with competing support vector machines. In Proceedings of IJCNN, pages 2387–2392, 2002.
[33] Chih-Chung Chang and Chih-Jen Lin. IJCNN 2001 challenge: Generalization ability and text decoding. In Proceedings of IJCNN. IEEE, 2001.
[34] Shuo-Peng Liao, Hsuan-Tien Lin, and Chih-Jen Lin. A note on the decomposition methods for support vector regression. In Proceedings of IJCNN, 2001.
[35] Chih-Chung Chang, Chih-Wei Hsu, and Chih-Jen Lin. The analysis of decomposition methods for support vector machines. In Workshop on Support Vector Machines, IJCAI 99, 1999.
[36] Chih-Jen Lin, Nestor Michelena, and Romesh Saigal. Topological fixture synthesis using semidefinite programming. In Proceedings of the Third World Congress of Structural and Multidisciplinary Optimization (WCSMO-3), May 17-21, 1999.
[37] Chih-Jen Lin. Preconditioning dense linear systems from large-scale semidefinite programming problems. In Proceedings of the Fifth Copper Mountain Conference on Iterative Methods, 1998.

• BOOK CHAPTERS
[1] Léon Bottou and Chih-Jen Lin. Support vector machine solvers. In Léon Bottou, Olivier Chapelle, Dennis DeCoste, and Jason Weston, editors, Large Scale Kernel Machines, pages 1–28. MIT Press, Cambridge, MA, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/bottou_lin.pdf.
[2] Yi-Wei Chen and Chih-Jen Lin.
Combining SVMs with various feature selection strategies. In Isabelle Guyon, Steve Gunn, Masoud Nikravesh, and Lofti Zadeh, editors, Feature Extraction, Foundations and Applications. Springer, 2006.
[3] Soon-Yi Wu, Shu-Cherng Fang, and Chih-Jen Lin. Analytic center based cutting plane method for linear semi-infinite programming. In M. Goberna and M. Lopez, editors, Semi-infinite Programming: Recent Advances. Kluwer, 2001.
[4] Chih-Jen Lin, Shu-Cherng Fang, and Soon-Yi Wu. A dual affine scaling based algorithm for solving linear semi-infinite programming problems. In D. Z. Du and J. Sun, editors, Advances in Optimization and Application, pages 217–234. Kluwer Academic Publishers, 1994.

• TECHNICAL REPORTS:
[1] Chih-Wei Hsu, Chih-Chung Chang, and Chih-Jen Lin. A practical guide to support vector classification. Technical report, Department of Computer Science, National Taiwan University, 2003. URL http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf.
[2] Hsuan-Tien Lin and Chih-Jen Lin. A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods. Technical report, Department of Computer Science, National Taiwan University, 2003. URL http://www.csie.ntu.edu.tw/~cjlin/papers/tanh.pdf.
[3] Jen-Hao Lee and Chih-Jen Lin. Automatic model selection for support vector machines. Technical report, Department of Computer Science and Information Engineering, National Taiwan University, 2000.
[4] Chih-Jen Lin. Study in Large-Scale Optimization. PhD thesis, University of Michigan, Ann Arbor, Michigan, 1998.
[5] Chih-Jen Lin and Romesh Saigal. A predictor corrector method for semi-definite linear programming. Technical report, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, MI 48109-2117, 1995.
[6] Chih-Jen Lin and Romesh Saigal. An infeasible start predictor corrector method for semi-definite linear programming.
Technical report, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, MI 48109-2117, 1995.

• SOFTWARE
1. LIBSVM: an integrated software for support vector classification and regression, released April 2000. (with C.-C. Chang)
(http://www.csie.ntu.edu.tw/~cjlin/libsvm)
More than 900,000 downloads from April 2000 to October 2016.
About 28,000 Google Scholar citations (up to October 2016).
2. LIBLINEAR: a library for large linear classification, released April 2007. (with my research group)
(http://www.csie.ntu.edu.tw/~cjlin/liblinear)
More than 150,000 downloads from April 2007 to October 2016.
3. BSVM: a decomposition method for large-scale support vector machines, released February 2000. (with C.-W. Hsu)
(http://www.csie.ntu.edu.tw/~cjlin/bsvm)
4. TRON: a bound-constrained optimization software, released in May 1999. (with J. J. Moré)
(http://www.mcs.anl.gov/~more/tron)
5. ICFS: an incomplete Cholesky factorization for sparse matrices, released August 1998. (with J. J. Moré)
(http://www.mcs.anl.gov/~more/icf)

• INVITED TALKS AND MISCELLANEOUS PRESENTATIONS
1. “Matrix factorization and factorization machines for recommender systems.” Keynote at the 4th Workshop on Large-Scale Recommender Systems, Boston, September 2016.
2. “When and when not to use distributed machine learning?” Keynote at International Winter School on Big Data, Bilbao, Spain, February 2016.
3. “Large-scale linear and kernel classification.” Invited talk at Microsoft Research India Summer School 2015 on Machine Learning, June 15, 2015.
4. “Matrix factorization and factorization machines for recommender systems.” Invited talk at SDM Workshop on Machine Learning Methods on Recommender Systems, May 2, 2015.
5. “Big-data machine learning: status and challenges.” Invited talk at China R Conference, Hangzhou, China, October 29, 2014.
6.
“Experiences and lessons in developing machine learning software.” Invited talk at Industry Track, ACM Conference on Information and Knowledge Management (CIKM), Shanghai, November 4, 2014.
7. “Large-scale linear classification: status and challenges.” Invited talk at San Francisco Machine Learning Meetup, October 30, 2014.
8. “Big-data machine learning.” Invited speech at eBay Data Summit, Shanghai, China, October 25, 2014.
9. “Big-data analytics: challenges and opportunities.” Keynote speech at Taiwan Data Science Conference, Taipei, August 30, 2014.
10. “Distributed data classification.” Invited talk at Workshop on New Learning Frameworks and Models for Big Data, ICML, June 25, 2014.
11. “Distributed data classification.” Invited talk at Workshop on Scalable Data Analytics, PAKDD, May 13, 2014.
12. “Large-scale machine learning.” Invited talk at International Conference on Big Data and Cloud Computing, Xiamen, China, December 29, 2013.
13. “Distributed Newton methods for CTR (click-through rate) prediction.” Invited talk at Mysore Park workshop on distributed computing for machine learning and optimization, India, December 19, 2013.
14. “Recent advances in large-scale linear classification.” Invited talk at Asian Conference on Machine Learning, November 15, 2013.
15. “Experiences and lessons in developing machine learning and data mining software.” Invited talk at China R Conference, Shanghai, China, November 2, 2013.
16. “Optimization and machine learning.” Plenary talk at 11th EUROPT Workshop on Advances in Continuous Optimization, Florence, Italy, June 26, 2013.
17. “Optimization and machine learning.” 25th Simon Stevin Lecture, K. U. Leuven Optimization in Engineering Center, Leuven, Belgium, January 17, 2013.
18. “Machine learning software: design and practical use.” Invited talk at Machine Learning Summer School (MLSS), Kyoto, August 2012.
19.
“Experiences and lessons in developing industry-strength machine learning and data mining software.” Invited talk at Industry Practice Expo of ACM KDD 2012, Beijing, August 2012.
20. “Machine learning software: design and practical use.” Invited talk at Machine Learning Summer School (MLSS), Santa Cruz, July 2012.
21. “Large-scale machine learning in distributed environments.” Tutorial at ACM International Conference on Multimedia Retrieval, June 2012.
22. “Support vector machines and kernel methods.” Invited tutorial at Asian Conference on Machine Learning, Tokyo, Japan, November 8, 2010.
23. “Support vector machines and kernel methods.” Plenary talk at International Workshop on Recent Trends in Learning, Computation, and Finance, Pohang, Korea, August 30, 2010.
24. “Training support vector machines: status and challenges.” Invited speaker at ICML 2008 Workshop on Large Scale Learning Challenge.
25. “Training support vector machines: status and challenges.” Invited speaker at Google Machine Learning Summit, May 2008.
26. “Support vector machines.” Invited tutorial speaker at Machine Learning Summer School (MLSS), Taipei, July 2006.
27. “Training linear and non-linear SVMs.” Invited talk at Workshop on Mathematics and Medical Diagnosis, Erice, Italy, July 2006.
28. “Support vector machines for data classification.” Invited tutorial at ICONIP 2005, Taiwan, October 30, 2005.
29. “Optimization issues in training support vector machines.” Invited talk at the 16th International Conference on Algorithmic Learning Theory, Singapore, October 9, 2005.
30. “Support vector machines for data classification.” Invited plenary talk at the XXXVI Annual Conference of the Italian Operational Research Society, Camerino, Italy, September 8, 2005.
31. “Generalized Bradley-Terry model and multi-class probability estimates.” Talk in an invited session at ISI (International Statistical Institute) 2005, Australia, April 6, 2005.
32.
“Report on NIPS 2003 Feature Selection Competition.” NIPS workshop on feature selection competition, Canada, December 12, 2003.
33. “Optimization techniques for data mining and machine learning.” Invited talk at Workshop on Optimization and Control, National Cheng Kung University, Tainan, Taiwan, January 6, 2003.
34. “Support vector machines for time series segmentation.” Invited talk at the 2002 Taipei International Statistical Symposium and Bernoulli Society EAPR Conference, Taipei, July 7-10, 2002.
35. “Support vector machines for protein classification/prediction.” Invited talk at the 8th Symposium on Recent Advances in Biophysics, Taipei, May 23, 2002.
36. “Automatic model selection using the decomposition methods.” NIPS workshop on kernel methods, Breckenridge, CO, December 1, 2000.
37. “Newton’s method for support vector machines.” Talk at the Sixth SIAM Conference on Optimization, Atlanta, May 1999.
38. “Structural optimization and semidefinite programming.” Talk at INFORMS Fall Meeting, Seattle, October 1998.
39. “Preconditioning dense linear systems from large-scale semidefinite programming problems.” Talk at the Fifth Copper Mountain Conference on Iterative Methods, Copper Mountain, Colorado, April 1998.
40. “Incomplete Cholesky factorizations with limited memory.” Talk at the Fourth Kalamazoo Symposium on Matrix Analysis & Applications, Kalamazoo, MI, October 1997.
41. “Newton’s method for large bound-constrained optimization problems.” Talk at International Symposium on Mathematical Programming, Lausanne, Switzerland, August 1997.
42. “An unconstrained convex programming approach for solving linear semi-infinite programming problems.” Talk at International Symposium on Mathematical Programming, Lausanne, Switzerland, August 1997.
43. “An infeasible start predictor corrector method for semidefinite linear programming.” Talk at Fifth SIAM Optimization Conference, Victoria, British Columbia, Canada, May 1996.

• ACADEMIC SERVICES
1.
Editorial services
– Action editor, Data Mining and Knowledge Discovery, 2009–
– Editorial board member, ACM Transactions on Intelligent Systems and Technology, 2012–
– Associate editor, IEEE Transactions on Neural Networks, 2005–2010
– Associate editor, Journal of Information Science and Engineering, 2009–2013
– Guest editor: special issue on Support Vector Machines, Neurocomputing, 2003
2. Reviewer for the following journals
– Journal of Machine Learning Research
– Machine Learning
– Neural Computation
– SIAM Journal on Matrix Analysis and Applications
– SIAM Journal on Optimization
– IEEE Transactions on Neural Networks
– IEEE Transactions on Pattern Analysis and Machine Intelligence
– IEEE Transactions on Knowledge and Data Engineering
– IEEE Transactions on Big Data
– IEEE Transactions on Fuzzy Systems
– IEEE Transactions on Image Processing
– IEEE Transactions on Signal Processing
– IEEE Signal Processing Letters
– IEEE Transactions on Evolutionary Computation
– IEEE Transactions on Systems, Man, and Cybernetics
– IEEE Transactions on Semiconductor Manufacturing
– IEEE Transactions on Antennas and Propagation
– IEEE Transactions on Automation Science and Engineering
– IEEE Transactions on Audio, Speech and Language Processing
– Biometrika
– Neurocomputing
– Bioinformatics
– BMC Bioinformatics
– Theory of Computing Systems
– Neural Processing Letters
– Signal Processing
– International Journal of Pattern Recognition and Artificial Intelligence (IJPRAI)
– Artificial Intelligence Review
– Pattern Analysis & Applications
– Computational Intelligence and Neuroscience
– IIE Transactions
– Annals of the Institute of Statistical Mathematics
– Journal of Statistical Planning and Inference
– Statistics and Computing
– Communications in Statistics
– Pattern Recognition
– Pattern Recognition Letters
– Knowledge and Information Systems
– Computational Optimization and its Applications
– INFORMS Journal on Computing
– Journal of Global Optimization
– Optimization
– Numerical Algorithms
– Information Processing and Management
– Internet Electronic Journal of Molecular Design
– International Journal of Operations and Quantitative Management
– International Journal of Computer Mathematics
– Journal of Information Science and Engineering
– Journal of Computer Science and Technology (JCST)
– Journal of Formosan Medical Association
– Journal of Chinese Institute of Industrial Engineers
– Journal of the Chinese Institute of Engineers
– Journal of the Chinese Institute of Electrical Engineering
3. Reviewer for several book chapters
4. Conference chair, area chair, or senior program committee member:
– General co-chair, Mysore Park workshop on distributed computing for machine learning and optimization, India, 2013
– Senior PC, ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2013, 2014, 2015, 2016
– Senior PC, SIAM International Conference on Data Mining (SDM), 2017
– Area chair, Neural Information Processing Systems (NIPS), 2007, 2010, 2011, 2015
– Area chair, International Conference on Machine Learning (ICML), 2016, 2017
– General co-chair, Asian Conference on Machine Learning (ACML), 2011
– Senior PC, Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2013, 2014, 2015, 2016
– Senior PC, AAAI 2017
– Senior PC, IJCAI 2011 (IEAI track)
– Senior PC, Asian Conference on Machine Learning (ACML), 2010
5. Program committee member:
– AAAI 2016
– International Joint Conference on Artificial Intelligence (IJCAI), 2015
– Workshop on Large-Scale Recommender Systems at ACM RecSys, 2014
– The International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines: Theory and Applications (ROKS-2013), Belgium
– ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), Washington, D.C. 2010, San Diego 2011, Beijing 2012
– SIAM International Conference on Data Mining (SDM), 2013
– AI & Statistics 2010, 2014
– NIPS Workshop on Optimization for Machine Learning (2008, 2009, 2010, 2011, 2012, 2013)
– NIPS Workshop on AutoML (2014)
– International Conference on Machine Learning (ICML), Helsinki 2008, Montreal 2009, Haifa 2010, Bellevue, WA 2011, Scotland 2012, Atlanta 2013, Beijing 2014
– European Conference on Machine Learning (ECML) and European Conference on Principles and Practice of Knowledge Discovery in Databases (PKDD), 2008, 2010
– International Joint Conference on Neural Networks (IJCNN), Hong Kong 2008, San Jose, CA 2011
– Pacific-Rim Conference on Multimedia (PCM), Hong Kong 2007, Bangkok, Thailand 2009, Shanghai, China 2010
– IEEE International Conference on Multimedia & Expo (ICME), Beijing 2007, Hannover 2008
– Asian Conference on Machine Learning (ACML), 2009
– NIPS 2006 Workshop on Machine Learning Open Source Software
– ACM Multimedia Conference (ACM MM), Santa Barbara 2006
– International Colloquium on Grammatical Inference (ICGI), Japan 2006
– International Workshops on Statistical Techniques in Pattern Recognition (SPR), Hong Kong 2006, Orlando, Florida 2008, Turkey 2010
– Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), Singapore 2006, China 2007, Osaka, Japan 2008, Thailand 2009
– International Conference on Neural Information Processing (ICONIP), India 2004, Hong Kong 2006
– International Workshop on Pattern Recognition with Support Vector Machines (SVM2002), Canada
– Fourth Asia-Pacific Conference on Industrial Engineering and Management Systems, 2002, Taiwan
6.
Reviewer for the following conferences
– Neural Information Processing Systems (NIPS), 2003, 2004, 2005, 2006, 2014, 2016
– Conference on Learning Theory (COLT), 2003, 2009
– International Joint Conference on Neural Networks (IJCNN), 2003, 2004, 2005
– IEEE International Conference on Multimedia & Expo (ICME), 2009
– First Asia-Pacific Bioinformatics Conference, Australia, 2003
– The Seventh Pacific Rim International Conference on Artificial Intelligence (PRICAI-02)
7. Other conference planning and administration
– Special session organizer, ICONIP 2002, Singapore
8. Thesis External Reviewers:
– University of Trento, Italy: Nicola Segata (Ph.D. 2009)
– Australian National University: Jin Yu (Ph.D. 2009)
– Ruhr-Universität Bochum: Tobias Glasmachers (Ph.D. 2008)
– Hong Kong University of Science and Technology: Ivor Tsang (Ph.D. 2007)
– National University of Singapore: Chu Wei (Ph.D. 2003), Kaibo Duan (Ph.D. 2003), Jianbo Yang (Ph.D. 2011)
– Nanyang Technological University: Mingkui Tan (Ph.D. 2014)
– Chinese University of Hong Kong: Wan Zhang (M.Phil. 2003)
9. Proposal Reviewers:
– Research Grants Council, Hong Kong, 2006, 2007, 2008, 2009, 2010
– American University of Beirut, 2009
– Czech Science Foundation, 2010
10. Other Services:
– IEEE CS society fellow evaluation committee member (2011, 2012)

• TALKS IN ACADEMIC INSTITUTES AND INDUSTRY
– International:
1. Microsoft, Redmond, Washington, October 6, 2016
2. Guangdong University of Technology, Guangzhou, China, June 20, 2016
3. Samsung Research America, California, June 10, 2016
4. UC Davis, California, May 4, 2016
5. Netflix, California, May 3, 2016
6. Huawei Research Labs, Shenzhen, China, April 19, 2016
7. Chinese University of Hong Kong, Shenzhen, China, April 18, 2016
8. Facebook, California, November 13, 2015
9. Quora, California, November 12, 2015
10. Nanjing University, December 25, 2014
11. University of Electronic Science and Technology of China, Chengdu, China, November 30 and December 1, 2014 (two talks)
12.
Twitter, California, October 31, 2014
13. eBay China, October 24, 2014
14. Microsoft Research, New York City, August 22, 2014 (open machine learning software workshop)
15. Microsoft, Redmond, Washington, August 18, 2014
16. Criteo, California, August 1, 2014
17. Databricks, California, July 31, 2014
18. Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, China, June 27, 2014
19. Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, June 26, 2014
20. Samsung Research America, California, May 23, 2014
21. Walmart Labs, California, April 23, 2014
22. Pandora, California, April 22, 2014
23. Alibaba, Hangzhou, China, December 27, 2013
24. eBay China, November 1, 2013
25. Shanghai Jiao Tong University, October 31, 2013
26. Microsoft Research, Redmond, August 15, 2013
27. University of Rome “La Sapienza,” June 25, 2013
28. K. U. Leuven, Belgium, January 14-16, 2013
29. Baidu, China, October 24, 2012
30. Luminescent Technology, California, August 24, 2012
31. eBay Machine Learning Forum, San Jose, California, February 17, 2012
32. City University of Hong Kong, December 30, 2011
33. NEC Labs, Cupertino, California, August 26, 2011
34. Adobe, California, August 25, 2011
35. eBay Research, San Jose, California, December 7, 2010
36. Facebook, Palo Alto, California, December 6, 2010
37. Baidu, China, September 3, 2010
38. Google Research New York, July 29, 2010
39. Yahoo! Research, Santa Clara, California, July 23, 2010
40. China Agriculture University, October 16, 2009
41. Microsoft Research Asia, October 13, 2009
42. Department of Computer Science and Engineering, Hong Kong University of Science and Technology, February 5, 2009
43. Department of Computer Science and Technology, Tsinghua University, China, September 5, 2008
44. HP Labs China, June 26, 2008
45. IBM T. J. Watson Research Center, May 16, 2008
46. Department of Industrial and Operations Engineering, University of Michigan, August 15, 2007
47.
Yahoo! Research, Santa Clara, California, February 20, 2007
48. NEC Labs, Princeton, New Jersey, February 15, 2007
49. Siemens Corporate Research, Princeton, New Jersey, February 14, 2007
50. AT&T Research, February 13, 2007
51. California Institute of Technology, November 14, 2006
52. School of Information and Computer Science, University of California, Irvine, November 6, 2006
53. Yahoo! Research, Burbank, California, August 30, 2006
54. Mathematics and Computer Science Division, Argonne National Lab., June 23, 2006
55. Chinese University of Hong Kong, Hong Kong, December 12, 2005
56. Nanyang Technological University, Singapore, October 10, 2005
57. Università di Roma “La Sapienza” and Istituto di Analisi dei Sistemi ed Informatica del CNR, Italy, September 1-2, 2005 (a short course)
58. CWI (Dutch National Research Institute for Mathematics and Computer Science), February 9, 2004
59. Department of Electronics and Computer Science, University of Southampton, February 2-6, 2004 (two talks)
60. Department of Computer Science, University of Essex, January 22, 2004
61. Department of Statistics and Probability Theory, Vienna University of Technology, September 4, 2003
62. Fraunhofer Institute for Computer Architecture and Software Technology, Germany, August 18, 2003
63. Department of Computer Science, University of Essex, August 13, 2003
64. University of Freiburg, Germany, July 15, 2003
65. Max Planck Institute of Biological Cybernetics, Germany, July 9, 2003
66. University of Tuebingen, Germany, July 8, 2003
67. KXEN Corporation, Suresnes, France, February 17, 2003
68. Max Planck Institute of Informatics (Computer Science), Germany, February 10-16, 2003 (two talks)
69. Max Planck Institute of Biological Cybernetics, Germany, January 12-February 10, 2003 (three talks)
70. Department of Electrical and Computer Engineering, University of Michigan-Dearborn, August 27, 2002
71. Siemens Corporate Research, Princeton, New Jersey, August 21, 2002
72.
Department of Computer Science, Binghamton University, August 19, 2002
73. Merck Research Lab., New Jersey, August 16, 2002
74. Agilent Inc., Colorado, July 31, 2001
75. Ford Research Lab., Michigan, July 24, 2001
76. Department of Electrical Engineering, Ohio State University, August 29, 2000

– Domestic:
1. Institute of Statistics, National Tsing Hua University, April 29, 2016
2. Industrial Technology Research Institute, October 7, 8, 21, and 22, 2015 (a short course on data mining)
3. Industrial Technology Research Institute, July 18, 22, 24, and August 12, 2014 (a short course on data mining)
4. Interdisciplinary Science Program, National Chiao Tung University, March 28, 2014
5. Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, September 24, 2012
6. Department of Mathematics, National Taiwan University, October 17, 2011
7. Department of Financial and Computational Mathematics, Providence University, September 22, 2011
8. Department of Mathematics, National Taiwan Normal University, April 20, 2011
9. Department of Applied Informatics, Fo Guang University, April 14, 2011
10. Department of Information Management, National Taiwan University, February 25,
11. Institute of Information Science, Academia Sinica, February 16, 2011
12. Department of Computer Science and Information Engineering, Chaoyang University of Technology, October 29, 2010
13. Graduate Institute of Communication Engineering, National Taiwan University, September 27, 2010
14. Department of Computer Science and Information Engineering, National Central University, November 12, 2008
15. Department of Information Management, Chaoyang University of Technology, October 30, 2007
16. Department of Computer Science and Information Engineering, National Cheng-Kung University, October 26, 2007
17. Department of Computer Science, National Chengchi University, November 10, 2005
18. Department of Computer Science, National Chi-Nan University, September 24, 2004
19.
Institute of Information Science, Academia Sinica, April 15, 2004
20. Department of Statistics, National Chiao Tung University, April 9, 2004
21. Computer and Communications Research Laboratories, Industrial Technology Research Institute, February 27 and March 3, 2004 (8 hours)
22. Computer and Communications Research Laboratories, Industrial Technology Research Institute, November 18, 2003
23. Department of Information Management, Chaoyang University of Technology, November 4, 2003
24. Graduate Institute of Industrial Engineering, National Taiwan University, April 23,
25. Department of Mathematics, National Taiwan University, March 10, 2003
26. Department of Information and Computer Engineering, Chung Yuan Christian University, December 16, 2002
27. Department of Statistics, Feng Chia University, November 1, 2002
28. Department of Statistics, National Chengchi University, October 14, 2002
29. Asian BioInnovations Corporation, Taipei, June 14, 2002
30. Graduate Program in Bioinformatics, National Yang Ming University, March 29,
31. Department of Information Science and Management, Providence University, March 22, 2002
32. Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, March 11, 2002
33. Institute of Statistical Science, Academia Sinica, January 16, 2002
34. Department of Mathematics, National Taiwan University, January 5, 2002
35. Institute of Computer Science and Information Engineering, Chang Gung University, December 4, 2001
36. Graduate Institute of Medical Informatics, Taipei Medical University, November 22, 2001
37. Department of Information Management, National Taichung Institute of Technology, October 23, 2001
38. Graduate Institute of Industrial Engineering, National Taiwan University, October 3, 2001
39. Department of Information Management, National Taiwan University of Science and Technology, September 27, 2001
40. Department of Biological Science and Technology, National Chiao Tung University, September 26, 2001
41.
Institute of Information Science, Academia Sinica, August 28-29, 200142. Department Computer Science and Information Engineering, National Cheng-KungUniversity, May 25, 200143. Department of Information and Computer Education, National Taiwan Normal University,April 9, 200144. Institute of Statistical Science, Academia Sinica, February 19, 200145. Department Computer Science and Information Engineering, National Central University,January 17, 200146. Division of Biostatistics and Bioinformatics, National Health Research Institutes,December 6, 200047. Institute of Biochemistry, National Yang-Ming University, June 5, 200048. Department Computer Science and Information Engineering, National Chung ChengUniversity, May 22, 200049. Institute of Information Science, Academia Sinica, November 19, 199950. Department of Computer Science, National Tsing Hua University, June 2, 199951. Department Computer Science and Information Engineering, National Taiwan University,March 5, 199952. Department of Industrial Engineering, National Tsing Hua University, December 24,53. Department Computer Science and Information Engineering, National Taiwan University,December 26, 199754. Department Mathematics, National Cheng-Kung University, December 19, 199755. Institute of Information Management, National Chi-Nan University, December 18,56. Department of Industrial Engineering, National Tsing Hua University, December 17,57. Department Mathematics, National Cheng-Kung University, May, 1997• TEACHING EXPERIENCE1. Operations Research (Fall 1998, Fall 1999, Fall 2000)2. Scientific computing (Winter 1999, Winter 2000)3. Numerical methods (Winter 2001, Winter 2002, Winter 2003, Winter 2009, Winter 2010,Winter 2011, Winter 2013, Winter 2014, Winter 2016)4. Statistical learning theory (Fall 1999, Fall 2000, Fall 2001, Fall 2002, Fall 2003, Fall 2004,Fall 2005)5. Data mining and machine learning (Fall 2001, Fall 2002, Winter 2004, Winter 2005,Winter 2006, Winter 2007)6. 
Introduction to the theory of computation (Fall 2003, Fall 2004, Fall 2005, Fall 2007,Fall 2008, Fall 2009, Fall 2010, Fall 2011, Fall 2012, Fall 2013, Fall 2014, Fall 2015)7. Machine learning: theory and practice (Winter 2007, Winter 2010, Winter 2013)8. Optimization and machine learning (Fall 2010, Fall 2011, Winter 2014, Fall 2015)• MEMBERSHIPS: IEEE (fellow), ACM (fellow), AAAI (fellow)That’s all folks !! Hope you had a nice trip. :)
