Schedule C 4 Form 886 A May 2017: Fill & Download for Free

GET FORM

Download the form

How to Edit Your Schedule C 4 Form 886 A May 2017 Online Free of Hassle

Follow these steps to edit your Schedule C 4 Form 886 A May 2017 smoothly:

  • Select the Get Form button on this page.
  • You will be taken to our PDF editor.
  • Edit your file with our easy-to-use tools in the top toolbar, such as adding checkmarks or erasing.
  • Hit the Download button to save your completed document for future reference.

How to Edit Your Schedule C 4 Form 886 A May 2017 Online

When you edit your document, you may need to add text, insert the date, and make other changes. CocoDoc makes it easier than ever to edit your form quickly. Here is how to finish your work in no time.

  • Select the Get Form button on this page.
  • You will be taken to the CocoDoc PDF editor webpage.
  • Once in the editor, click the tool icon in the top toolbar to edit your form, for example by checking boxes or highlighting.
  • To add a date, click the Date icon, then drag the generated date to the field you need to fill in.
  • Change the default date by deleting it and entering the desired date in the box.
  • Click OK to confirm the added date, then click the Download button to save your document.

How to Edit Text for Your Schedule C 4 Form 886 A May 2017 with Adobe DC on Windows

Adobe DC on Windows is a popular tool for editing files on a PC. It is especially useful when you need to work on files without a network connection. So, let's get started.

  • Find and open the Adobe DC app on Windows.
  • Find and click the Edit PDF tool.
  • Click the Select a File button and upload a file for editing.
  • Click a text box to modify the text font, size, and other formats.
  • Select File > Save or File > Save As to save your changes to Schedule C 4 Form 886 A May 2017.

How to Edit Your Schedule C 4 Form 886 A May 2017 With Adobe DC on Mac

  • Find the file to be edited and open it with Adobe DC for Mac.
  • Click Edit PDF in the panel on the right.
  • Edit your form as needed by selecting the tools in the top toolbar.
  • Click the Fill & Sign tool and select the Sign icon in the top toolbar to create your own signature.
  • Select File > Save to save all your edits.

How to Edit your Schedule C 4 Form 886 A May 2017 from G Suite with CocoDoc

Do you use G Suite for your work and need to sign forms? With CocoDoc you can edit your form in Google Drive and fill out your PDF through a streamlined procedure.

  • Install the CocoDoc for Google Drive add-on.
  • In Drive, find the form to be filled, right-click it, and select Open With.
  • Select the CocoDoc PDF option, and allow your Google account to connect to CocoDoc in the pop-up window.
  • Choose the PDF Editor option to begin filling out the form.
  • Use the tools in the top toolbar to edit your Schedule C 4 Form 886 A May 2017 in the fields to be filled, for example by signing or adding text.
  • Click the Download button so you do not lose your changes.

PDF Editor FAQ

Who has the world's greatest CV ever?

This is Prof. Chih-Jen Lin.

My browser got stuck for a while when I tried to paste this CV here. Anyway, Chih-Jen Lin is a notable person who has contributed a great deal to the field of machine learning and related areas. He is a Professor of Computer Science at National Taiwan University and a leading researcher in machine learning, optimization, and data mining. He is best known for the open-source library LIBSVM, an implementation of support vector machines.

This is his home page: http://www.csie.ntu.edu.tw/~cjlin/

And here comes his 28-page "simple" CV: http://www.csie.ntu.edu.tw/~cjlin/resume.pdf

Brace yourself and scroll down!!

Chih-Jen Lin

• PERSONAL DATA
1. Address: Department of Computer Science and Information Engineering, National Taiwan University, Taipei 106, Taiwan
2. Phone: (886) 2-33664923, Fax: (886) 2-23628167
3. E-mail: [email protected]
4. Homepage: http://www.csie.ntu.edu.tw/~cjlin/

• EDUCATION AND CURRENT POSITION
1. Distinguished professor, Department of Computer Science and Information Engineering, National Taiwan University, Taipei 106, Taiwan, 2011–present
2. Adjunct distinguished professor, Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei 106, Taiwan, August 2011–present
3. Adjunct distinguished professor, Graduate Institute of Industrial Engineering, National Taiwan University, Taipei 106, Taiwan, August 2011–present
4. Ph.D., Industrial & Operations Engineering, University of Michigan, September 1995–May 1998
5. M.S.E., Industrial & Operations Engineering, University of Michigan, September 1995–December 1996
6. B.S., Mathematics, National Taiwan University, October 1989–June 1993

• RESEARCH INTERESTS
1. Machine learning: support vector machines, large-scale data classification, and applications. We develop popular machine learning software including LIBSVM (http://www.csie.ntu.edu.tw/~cjlin/libsvm) and LIBLINEAR (http://www.csie.ntu.edu.tw/~cjlin/liblinear). According to Most Cited Articles in Computer Science, LIBSVM is among the 10 most cited computer science works of all time.
2. Large-scale optimization and its applications

• AWARDS AND RECOGNITION
– International:
1. ACM fellow, 2015
2. AAAI fellow, 2014
3. Best paper award, ACM Recommender Systems 2013 (with students Yong Zhuang, Wei-Sheng Chin, and Yu-Chin Juan)
4. ACM Distinguished Scientist, 2011
5. IEEE fellow (class of 2011) for contributions to support vector machine algorithms and software
6. Member of the NTU team to win the first prize of KDD Cup 2010, 2011, and 2013
7. Best research paper award, ACM KDD 2010 (with students Hsiang-Fu Yu, Cho-Jui Hsieh, and Kai-Wei Chang)
8. Supervising students Chia-Hua Ho and Ming-Hen Tsai to win the 2nd place of Active Learning Challenge 2010 (Active Learning - Causality Workbench)
9. Member of the NTU team to win the 3rd place of KDD Cup 2009 (extended track)
10. Winner of ICML 2008 large-scale learning challenge (linear SVM track; with students Hsiang-Fu Yu, Cho-Jui Hsieh, and Kai-Wei Chang). http://largescale.first.fraunhofer.de/summary/
11. Supervising student Yin-Wen Chang to win the WCCI 2008 Causation and Prediction Challenge (Causality Workbench)
12. Winner of WCCI 2002 competition on sequence recognition (with master students Ming-Wei Chang and Bo-Juen Chen)
13. Winner of the EUNITE 2001 worldwide competition (18 research groups) on electricity load prediction (Electricity Load Forecast using Intelligent Adaptive Technology). EUNITE is the European Network of Excellence on Intelligent Technology for smart adaptive systems (with master students Ming-Wei Chang and Bo-Juen Chen)
14. Winner of IJCNN Challenge 2001. IJCNN is one of the major neural networks conferences (with master student Chih-Chung Chang)
15. Winner of the OCR (Optical Character Recognition) competition organized by the University of Essex and the UK Post Office, December 2000 (with master student Chih-Chung Chang)
16. Second prize of the student paper competition, Fifth Copper Mountain Conference on Iterative Methods, 1998
17. Wallace J. Givens Research Associate (twice): competitive positions in the Mathematics and Computer Science Division of Argonne National Laboratory, intended to encourage graduate students who are beginning careers in computational science

– Domestic:
1. Outstanding research award of Pan Wen Yuan Foundation, Taiwan, 2016
2. Pegatron Chair Professorship, 2016
3. Teco Award, 2015
4. Macronix International Co. Chair Professorship, 2014
5. K. T. Li Breakthrough Award, Institute of Information & Computing Machinery, Taiwan, 2012
6. NTU EECS Academic Excellence Award, NTU College of EECS, 2011
7. Ten outstanding young persons of Taiwan, 2011
8. Distinguished Scholar Research Project, National Science Council, Taiwan, 2009–2012
9. Outstanding Research Award, National Science Council, Taiwan, 2007, 2010, and 2013
10. Ta-You Wu Memorial Award, National Science Council, Taiwan, 2006
11. Fu Ssu-Nien Award of National Taiwan University, 2005
12. Research award for young researchers from Pan Wen-Yuan Foundation, Taiwan, 2003
13. K. T. Li award for young researchers from ACM Taipei/Taiwan chapter, July 2002 (one awarded per year for young computer scientists in Taiwan)
14. Young investigator award from Academia Sinica, Taiwan, May 2002 (15 awarded per year across all research areas)
15. Prize for Outstanding Performance, National Mathematics Contest, R.O.C., 1989

• PROFESSIONAL EXPERIENCE
1. Visiting researcher, Microsoft, January 2015 – September 2015, August 2016 – February 2017
2. Visiting principal research scientist, eBay Research Labs, January 2012 – September 2012
3. Visiting scientist, Google Research, February 2008 – September 2008
4. Visiting scientist, Yahoo! Research, Burbank, California, August 2006 – February 2007
5. Distinguished Professor (August 2011–present), Professor (August 2006–present), Associate Professor (August 2002–August 2006), Assistant Professor (August 1998–August 2002), Department of Computer Science and Information Engineering, National Taiwan University, Taipei 106, Taiwan
6. Adjunct Associate Professor, Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei 106, Taiwan, August 2004–August 2006
7. Adjunct Associate Professor (August 2002–August 2006), Adjunct Assistant Professor (August 2001–August 2002), Graduate Institute of Industrial Engineering, National Taiwan University, Taipei 106, Taiwan
8. Visiting Scientist, Mathematics and Computer Science Division, Argonne National Laboratory, January 1999–February 1999, May 1999–August 1999
9. Research Associate, Mathematics and Computer Science Division, Argonne National Laboratory, January 1997–April 1997, September 1997–September 1998
10. Wallace J. Givens Research Associate, Mathematics and Computer Science Division, Argonne National Laboratory, May 1996–August 1996 and May 1997–August 1997
11. Research Assistant, Department of Industrial and Operations Engineering, University of Michigan, September 1995–August 1998
12. Teaching Assistant, Department of Industrial and Operations Engineering, University of Michigan, September 1996–December 1996
13. Second Lieutenant, R.O.C. Army, July 1993 – May 1995

• JOURNAL PAPERS
[1] Wei-Sheng Chin, Bo-Wen Yuan, Meng-Yuan Yang, Yong Zhuang, Yu-Chin Juan, and Chih-Jen Lin. LIBMF: A library for parallel matrix factorization in shared-memory systems. Journal of Machine Learning Research, 17(86):1–5, 2016. URL https://www.csie.ntu.edu.tw/~cjlin/papers/libmf/libmf_open_source.pdf.
[2] Wei-Sheng Chin, Yong Zhuang, Yu-Chin Juan, and Chih-Jen Lin.
A fast parallel stochastic gradient method for matrix factorization in shared memory systems. ACM Transactions on Intelligent Systems and Technology, 6:2:1–2:24, 2015. URL http://www.csie.ntu.edu.tw/~cjlin/papers/libmf/libmf_journal.pdf.
[3] Chien-Chih Wang, Chun-Heng Huang, and Chih-Jen Lin. Subsampled Hessian Newton methods for supervised learning. Neural Computation, 27:1766–1795, 2015. URL http://www.csie.ntu.edu.tw/~cjlin/papers/sub_hessian/sample_hessian.pdf.
[4] Po-Wei Wang and Chih-Jen Lin. Iteration complexity of feasible descent methods for convex optimization. Journal of Machine Learning Research, 15:1523–1548, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cdlinear.pdf.
[5] Ching-Pei Lee and Chih-Jen Lin. Large-scale linear rankSVM. Neural Computation, 26(4):781–817, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ranksvm/ranksvml2.pdf.
[6] Ching-Pei Lee and Chih-Jen Lin. A study on L2-loss (squared hinge-loss) multi-class SVM. Neural Computation, 25(5):1302–1323, 2013. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l2mcsvm/l2mcsvm.pdf.
[7] Chia-Hua Ho and Chih-Jen Lin. Large-scale linear support vector regression. Journal of Machine Learning Research, 13:3323–3348, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/linear-svr.pdf.
[8] Guo-Xun Yuan, Chia-Hua Ho, and Chih-Jen Lin. An improved GLMNET for l1-regularized logistic regression. Journal of Machine Learning Research, 13:1999–2030, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l1_glmnet/long-glmnet.pdf.
[9] Guo-Xun Yuan, Chia-Hua Ho, and Chih-Jen Lin. Recent advances of large-scale linear classification. Proceedings of the IEEE, 100(9):2584–2603, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/survey-linear.pdf.
[10] Hsiang-Fu Yu, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Large linear classification when data cannot fit in memory. ACM Transactions on Knowledge Discovery from Data, 5(4):23:1–23:23, February 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/kdd_disk_decomposition.pdf.
[11] Chih-Chung Chang and Chih-Jen Lin. LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2(3):27:1–27:27, 2011. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[12] Wen-Yen Chen, Yangqiu Song, Hongjie Bai, Chih-Jen Lin, and Edward Y. Chang. Parallel spectral clustering in distributed systems. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(3):568–586, 2011.
[13] Ruby C. Weng and Chih-Jen Lin. A Bayesian approximation method for online ranking. Journal of Machine Learning Research, 12:267–300, 2011. URL http://www.csie.ntu.edu.tw/~cjlin/papers/online_ranking/online_journal.pdf.
[14] Hsiang-Fu Yu, Fang-Lan Huang, and Chih-Jen Lin. Dual coordinate descent methods for logistic regression and maximum entropy models. Machine Learning, 85(1-2):41–75, October 2011. URL http://www.csie.ntu.edu.tw/~cjlin/papers/maxent_dual.pdf.
[15] Guo-Xun Yuan, Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. A comparison of optimization methods and software for large-scale l1-regularized linear classification. Journal of Machine Learning Research, 11:3183–3234, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l1.pdf.
[16] Yin-Wen Chang, Cho-Jui Hsieh, Kai-Wei Chang, Michael Ringgaard, and Chih-Jen Lin. Training and testing low-degree polynomial data mappings via linear SVM. Journal of Machine Learning Research, 11:1471–1490, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/lowpoly_journal.pdf.
[17] Fang-Lan Huang, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Iterative scaling and coordinate descent methods for maximum entropy. Journal of Machine Learning Research, 11:815–848, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/maxent_journal.pdf.
[18] Chih-Jen Lin, Stefano Lucidi, Laura Palagi, Arnaldo Risi, and Marco Sciandrone. Decomposition algorithm model for singly linearly constrained problems subject to lower and upper bounds. Journal of Optimization Theory and Applications, 141:107–126, 2009.
[19] Tzu-Kuo Huang, Chih-Jen Lin, and Ruby C. Weng. Ranking individuals by group comparisons. Journal of Machine Learning Research, 9:2187–2216, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/genBTexp/genBTexp-jmlr.pdf.
[20] Rong-En Fan, Kai-Wei Chang, Cho-Jui Hsieh, Xiang-Rui Wang, and Chih-Jen Lin. LIBLINEAR: a library for large linear classification. Journal of Machine Learning Research, 9:1871–1874, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/liblinear.pdf.
[21] Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. Coordinate descent method for large-scale L2-loss linear SVM. Journal of Machine Learning Research, 9:1369–1398, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cdl2.pdf.
[22] Chih-Jen Lin, Ruby C. Weng, and S. Sathiya Keerthi. Trust region Newton method for large-scale logistic regression. Journal of Machine Learning Research, 9:627–650, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/logistic.pdf.
[23] Hsuan-Tien Lin, Chih-Jen Lin, and Ruby C. Weng. A note on Platt's probabilistic outputs for support vector machines. Machine Learning, 68:267–276, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf.
[24] Chih-Jen Lin. On the convergence of multiplicative update algorithms for non-negative matrix factorization. IEEE Transactions on Neural Networks, 18(6):1589–1596, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/multconv.pdf.
[25] Chih-Jen Lin. Projected gradient methods for non-negative matrix factorization. Neural Computation, 19:2756–2779, 2007. URL http://www.csie.ntu.edu.tw/~cjlin/papers/pgradnmf.pdf.
[26] Tzu-Kuo Huang, Ruby C. Weng, and Chih-Jen Lin. Generalized Bradley-Terry models and multi-class probability estimates. Journal of Machine Learning Research, 7:85–115, 2006. URL http://www.csie.ntu.edu.tw/~cjlin/papers/generalBT.pdf.
[27] Pai-Hsuen Chen, Rong-En Fan, and Chih-Jen Lin. A study on SMO-type decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 17:893–908, July 2006.
URL http://www.csie.ntu.edu.tw/~cjlin/papers/generalSMO.pdf.
[28] Rong-En Fan, Pai-Hsuen Chen, and Chih-Jen Lin. Working set selection using second order information for training SVM. Journal of Machine Learning Research, 6:1889–1918, 2005. URL http://www.csie.ntu.edu.tw/~cjlin/papers/quadworkset.pdf.
[29] Ming-Wei Chang and Chih-Jen Lin. Leave-one-out bounds for support vector regression model selection. Neural Computation, 17(5):1188–1222, 2005.
[30] Pai-Hsuen Chen, Chih-Jen Lin, and Bernhard Schölkopf. A tutorial on ν-support vector machines. Applied Stochastic Models in Business and Industry, 21:111–136, 2005. URL http://www.csie.ntu.edu.tw/~cjlin/papers/nusvmtoturial.pdf.
[31] Ting-Fan Wu, Chih-Jen Lin, and Ruby C. Weng. Probability estimates for multi-class classification by pairwise coupling. Journal of Machine Learning Research, 5:975–1005, 2004. URL http://www.csie.ntu.edu.tw/~cjlin/papers/svmprob/svmprob.pdf.
[32] Bo-Juen Chen, Ming-Wei Chang, and Chih-Jen Lin. Load forecasting using support vector machines: A study on EUNITE competition 2001. IEEE Transactions on Power Systems, 19(4):1821–1830, November 2004.
[33] Wei-Chun Kao, Kai-Min Chung, Chia-Liang Sun, and Chih-Jen Lin. Decomposition methods for linear support vector machines. Neural Computation, 16(8):1689–1704, 2004. URL http://www.csie.ntu.edu.tw/~cjlin/papers/linear.pdf.
[34] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Analysis of switching dynamics with competing support vector machines. IEEE Transactions on Neural Networks, 15(3):720–727, 2004.
[35] Chin-Sheng Yu, Chih-Jen Lin, and Jen-Kang Hwang. Predicting subcellular localization of proteins for Gram-negative bacteria by support vector machines based on n-peptide compositions. Protein Science, 13:1402–1406, 2004.
[36] Kai-Min Chung, Wei-Chun Kao, Chia-Liang Sun, Li-Lun Wang, and Chih-Jen Lin. Radius margin bounds for support vector machines with the RBF kernel. Neural Computation, 15:2643–2681, 2003.
[37] S. Sathiya Keerthi and Chih-Jen Lin. Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Computation, 15(7):1667–1689, 2003.
[38] Kuan-Min Lin and Chih-Jen Lin. A study on reduced support vector machines. IEEE Transactions on Neural Networks, 14(6):1449–1559, 2003. URL http://www.csie.ntu.edu.tw/~cjlin/papers/rsvmTEX.pdf.
[39] Chin-Sheng Yu, Jung-Ying Wang, Jinn-Moon Yang, Ping-Chiang Lyu, Chih-Jen Lin, and Jen-Kang Hwang. Fine-grained protein fold assignment by support vector machines using generalized n-peptide coding schemes and jury voting from multiple-parameter sets. Proteins, 50:531–536, 2003.
[40] Chih-Jen Lin. A formal analysis of stopping criteria of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 13(5):1045–1052, 2002. URL http://www.csie.ntu.edu.tw/~cjlin/papers/stop.ps.gz.
[41] Chih-Jen Lin. Asymptotic convergence of an SMO algorithm without any assumptions. IEEE Transactions on Neural Networks, 13(1):248–250, 2002. URL http://www.csie.ntu.edu.tw/~cjlin/papers/q2conv.pdf.
[42] Chih-Chung Chang and Chih-Jen Lin. Training ν-support vector regression: Theory and algorithms. Neural Computation, 14(8):1959–1977, 2002.
[43] Shuo-Peng Liao, Hsuan-Tien Lin, and Chih-Jen Lin. A note on the decomposition methods for support vector regression. Neural Computation, 14:1267–1281, 2002.
[44] Chih-Wei Hsu and Chih-Jen Lin. A comparison of methods for multi-class support vector machines. IEEE Transactions on Neural Networks, 13(2):415–425, 2002.
[45] Chih-Wei Hsu and Chih-Jen Lin. A simple decomposition method for support vector machines. Machine Learning, 46:291–314, 2002.
[46] Chih-Jen Lin. On the convergence of the decomposition method for support vector machines. IEEE Transactions on Neural Networks, 12(6):1288–1298, 2001. URL http://www.csie.ntu.edu.tw/~cjlin/papers/conv.ps.gz.
[47] Jinn-Moon Yang, Jorng-Tzong Horng, Chih-Jen Lin, and Cheng-Yan Kao. Optical coating design using the family competition evolutionary algorithm.
Evolutionary Computation, 9(4):421–444, 2001.
[48] Chih-Chung Chang and Chih-Jen Lin. Training ν-support vector classifiers: Theory and algorithms. Neural Computation, 13(9):2119–2147, 2001.
[49] Chih-Jen Lin. Formulations of support vector machines: a note from an optimization point of view. Neural Computation, 13(2):307–317, 2001.
[50] Shu-Cherng Fang, Chih-Jen Lin, and Soon-Yi Wu. Solving quadratic semi-infinite programming problems by using relaxed cutting plane scheme. Journal of Computational and Applied Mathematics, 129:89–104, 2001.
[51] Soon-Yi Wu, Shu-Cherng Fang, and Chih-Jen Lin. Solving the general capacity problem. Annals of Operations Research, 103:193–211, 2001.
[52] Chih-Chung Chang, Chih-Wei Hsu, and Chih-Jen Lin. The analysis of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 11(4):1003–1008, 2000.
[53] Chih-Jen Lin and Romesh Saigal. An incomplete Cholesky factorization for dense matrices. BIT, 40:536–558, 2000.
[54] Chih-Jen Lin and Jorge J. Moré. Newton's method for large-scale bound constrained problems. SIAM Journal on Optimization, 9:1100–1127, 1999.
[55] Chih-Jen Lin and Jorge J. Moré. Incomplete Cholesky factorizations with limited memory. SIAM J. Sci. Comput., 21:24–45, 1999.
[56] Shu-Cherng Fang, Soon-Yi Wu, and Chih-Jen Lin. Relaxed cutting plane method for solving linear semi-infinite programming problems. Journal of Optimization Theory and Applications, 99:759–779, 1998.
[57] Chih-Jen Lin, Soon-Yi Wu, and Shu-Cherng Fang. An unconstrained convex programming approach for solving linear semi-infinite programming problems. SIAM Journal on Optimization, 8(2), 1998.
[58] Chih-Jen Lin, Soon-Yi Wu, and Shu-Cherng Fang. On the parametric linear semi-infinite optimization. Applied Mathematics Letters, 9:89–96, 1996.
[59] Chih-Jen Lin, E. K. Yang, Shu-Cherng Fang, and Soon-Yi Wu. Implementation of an inexact approach to solving linear semi-infinite programming problems. Journal of Computational and Applied Mathematics, 61:87–103, 1995.
[60] Shu-Cherng Fang, Chih-Jen Lin, and Soon-Yi Wu. On solving convex quadratic semi-infinite programming problems. Optimization, 37:107–125, 1994.

• REFEREED CONFERENCE PAPERS
Some papers here are preliminary versions of journal papers.
[1] Yuchin Juan, Yong Zhuang, Wei-Sheng Chin, and Chih-Jen Lin. Field-aware factorization machines for CTR prediction. In Proceedings of the ACM Recommender Systems Conference (RecSys), 2016.
[2] Wei-Lin Chiang, Mu-Chu Lee, and Chih-Jen Lin. Parallel dual coordinate descent method for large-scale linear classification in multi-core environments. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2016. URL http://www.csie.ntu.edu.tw/~cjlin/papers/multicore_cddual.pdf.
[3] Hsin-Yuan Huang and Chih-Jen Lin. Linear and kernel classification: When to use which? In Proceedings of SIAM International Conference on Data Mining (SDM), 2016. URL http://www.csie.ntu.edu.tw/~cjlin/papers/kernel-check/kcheck.pdf.
[4] Mu-Chu Lee, Wei-Lin Chiang, and Chih-Jen Lin. Fast matrix-vector multiplications for large-scale logistic regression on shared-memory systems. In Proceedings of the IEEE International Conference on Data Mining (ICDM), 2015. URL http://www.csie.ntu.edu.tw/~cjlin/papers/multicore_liblinear_icdm.pdf.
[5] Bo-Yu Chu, Chia-Hua Ho, Cheng-Hao Tsai, Chieh-Yen Lin, and Chih-Jen Lin. Warm start for parameter selection of linear classifiers. In Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2015. URL http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/warm-start/warm-start.pdf.
[6] Wei-Sheng Chin, Yong Zhuang, Yu-Chin Juan, and Chih-Jen Lin. A learning-rate schedule for stochastic gradient methods to matrix factorization. In Proceedings of the Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2015.
URL http://www.csie.ntu.edu.tw/~cjlin/papers/libmf/mf_adaptive_pakdd.pdf.
[7] Yong Zhuang, Wei-Sheng Chin, Yu-Chin Juan, and Chih-Jen Lin. Distributed Newton method for regularized logistic regression. In Proceedings of the Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2015.
[8] Chieh-Yen Lin, Cheng-Hao Tsai, Ching-Pei Lee, and Chih-Jen Lin. Large-scale logistic regression and linear support vector machines using Spark. In Proceedings of the IEEE International Conference on Big Data, pages 519–528, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/spark-liblinear/spark-liblinear.pdf.
[9] Meng-Chieh Yu, Tong Yu, Shao-Chen Wang, Chih-Jen Lin, and Edward Y. Chang. Big data small footprint: The design of a low-power classifier for detecting transportation modes. Proceedings of the VLDB Endowment, 7:1429–1440, 2014.
[10] Cheng-Hao Tsai, Chieh-Yen Lin, and Chih-Jen Lin. Incremental and decremental training for linear classification. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ws/inc-dec.pdf.
[11] Tzu-Ming Kuo, Ching-Pei Lee, and Chih-Jen Lin. Large-scale kernel rankSVM. In Proceedings of SIAM International Conference on Data Mining, 2014. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ranksvm/kernel.pdf.
[12] Yong Zhuang, Wei-Sheng Chin, Yu-Chin Juan, and Chih-Jen Lin. A fast parallel SGD for matrix factorization in shared memory systems. In Proceedings of the ACM Recommender Systems, 2013. URL http://www.csie.ntu.edu.tw/~cjlin/papers/libmf.pdf.
[13] Raffay Hamid, Dennis Decoste, and Chih-Jen Lin. Dense non-rigid point-matching using random projections. In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013.
[14] Aditya Khosla, Raffay Hamid, Chih-Jen Lin, and Neel Sundaresan. Large-scale video summarization using web-image priors. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013.
[15] Guo-Xun Yuan, Chia-Hua Ho, and Chih-Jen Lin. An improved GLMNET for l1-regularized logistic regression. In Proceedings of the Seventeenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 33–41, 2011.
[16] Chia-Hua Ho, Ming-Hen Tsai, and Chih-Jen Lin. Active learning and experimental design with SVMs. In JMLR Workshop and Conference Proceedings: Workshop on Active Learning and Experimental Design, volume 16, pages 71–84, 2011. URL http://www.csie.ntu.edu.tw/~cjlin/papers/activelearning/activelearning.pdf.
[17] Hsiang-Fu Yu, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Large linear classification when data cannot fit in memory. In Proceedings of the Sixteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 833–842, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/kdd_disk_decomposition.pdf.
[18] Fang-Lan Huang, Cho-Jui Hsieh, Kai-Wei Chang, and Chih-Jen Lin. Iterative scaling and coordinate descent methods for maximum entropy. In Proceedings of the 47th Annual Meeting of the Association of Computational Linguistics (ACL), 2009. Short paper.
[19] Yin-Wen Chang and Chih-Jen Lin. Feature ranking using linear SVM. In JMLR Workshop and Conference Proceedings: Causation and Prediction Challenge (WCCI 2008), volume 3, pages 53–64, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/causality.pdf.
[20] Yangqiu Song, Wen-Yen Chen, Hongjie Bai, Chih-Jen Lin, and Edward Y. Chang. Parallel spectral clustering. In European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML/PKDD), 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/ecml08.pdf.
[21] S. Sathiya Keerthi, Sellamanickam Sundararajan, Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. A sequential dual method for large scale multi-class linear SVMs. In Proceedings of the Fourteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 408–416, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/sdm_kdd.pdf.
[22] Cho-Jui Hsieh, Kai-Wei Chang, Chih-Jen Lin, S. Sathiya Keerthi, and Sellamanickam Sundararajan. A dual coordinate descent method for large-scale linear SVM. In Proceedings of the Twenty Fifth International Conference on Machine Learning (ICML), 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cddual.pdf.
[23] Chih-Jen Lin, Ruby C. Weng, and S. Sathiya Keerthi. Trust region Newton method for large-scale logistic regression. In Proceedings of the 24th International Conference on Machine Learning (ICML), 2007. Software available at http://www.csie.ntu.edu.tw/~cjlin/liblinear.
[24] Tzu-Kuo Huang, Chih-Jen Lin, and Ruby C. Weng. Ranking individuals by group comparisons. In Proceedings of the Twenty Third International Conference on Machine Learning (ICML), 2006.
[25] Pai-Hsuen Chen, Rong-En Fan, and Chih-Jen Lin. Training support vector machines via SMO-type decomposition methods. In Proceedings of the 16th International Conference on Algorithmic Learning Theory (ALT 2005), pages 45–62, 2005.
[26] Tzu-Kuo Huang, Ruby C. Weng, and Chih-Jen Lin. A generalized Bradley-Terry model: From group competition to individual skill. In Advances in Neural Information Processing Systems 17. MIT Press, Cambridge, MA, 2005.
[27] Ting-Fan Wu, Chih-Jen Lin, and Ruby C. Weng. Probability estimates for multi-class classification by pairwise coupling. In Sebastian Thrun, Lawrence Saul, and Bernhard Schölkopf, editors, Advances in Neural Information Processing Systems 16. MIT Press, Cambridge, MA, 2004.
[28] Kai-Min Chung, Wei-Chun Kao, Tony Sun, and Chih-Jen Lin. Decomposition methods for linear support vector machines. In Proceedings of ICASSP 2003, pages 868–871, 2003.
[29] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Adaptive deterministic annealing for two applications: competing SVR of switching dynamics and travelling salesman problems.
In Proceedings of ICONIP 2002, pages 920–924, 2002.[30] Kai-Min Chung, Wei-Chun Kao, Tony Sun, Li-Lun Wang, and Chih-Jen Lin. Radiusmargin bounds for support vector machines with the RBF kernel. In Proceedings ofICONIP 2002, pages 893–897, 2002.[31] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Analysis of nonstationary timeseries using support vector machines. In Seong-Whan Lee and Alessandro Verri, editors,Proceedings of SVM 2002, Lecture Notes in Computer Science 2388, pages 160–170, NewYork, NY, USA, 2002. Springer-Verlag Inc.[32] Ming-Wei Chang, Chih-Jen Lin, and Ruby C. Weng. Analysis of switching dynamicswith competing support vector machines. In Proceedings of IJCNN, pages 2387–2392,2002.[33] Chih-Chung Chang and Chih-Jen Lin. IJCNN 2001 challenge: Generalization ability andtext decoding. In Proceedings of IJCNN. IEEE, 2001.[34] Shuo-Peng Liao, Hsuan-Tien Lin, and Chih-Jen Lin. A note on the decomposition methodsfor support vector regression. In Proceedings of IJCNN, 2001.[35] Chih-Chung Chang, Chih-Wei Hsu, and Chih-Jen Lin. The analysis of decompositionmethods for support vector machines. In Workshop on Support Vector Machines, IJCAI99,1999.[36] Chih-Jen Lin, Nestor Michelena, and Romesh Saigal. Topological fixture synthesis usingsemidefinite programming. In Proceedings of the Third World Congress of Structural andMultidisciplinary Optimization (WCSMO-3), May 17-21 1999.[37] Chih-Jen Lin. Preconditioning dense linear systems from large-scale semidefinite programmingproblems. In Proceedings of the Fifth Copper Mountain conference on iterativemethods, 1998.• BOOK CHAPTERS[1] L´eon Bottou and Chih-Jen Lin. Support vector machine solvers. In L´eon Bottou, OlivierChapelle, Dennis DeCoste, and Jason Weston, editors, Large Scale Kernel Machines, pages1–28. MIT Press, Cambridge, MA., 2007. URL Welcome to Chih-Jen Lin's Home Pagepapers/bottou_lin.pdf.[2] Yi-Wei Chen and Chih-Jen Lin. 
Combining SVMs with various feature selection strategies. In Isabelle Guyon, Steve Gunn, Masoud Nikravesh, and Lofti Zadeh, editors, Feature Extraction, Foundations and Applications. Springer, 2006.
[3] Soon-Yi Wu, Shu-Cherng Fang, and Chih-Jen Lin. Analytic center based cutting plane method for linear semi-infinite programming. In M. Goberna and M. Lopez, editors, Semi-infinite Programming: Recent Advances. Kluwer, 2001.
[4] Chih-Jen Lin, Shu-Cherng Fang, and Soon-Yi Wu. A dual affine scaling based algorithm for solving linear semi-infinite programming problems. In D. Z. Du and J. Sun, editors, Advances in Optimization and Application, pages 217–234. Kluwer Academic Publishers, 1994.

• TECHNICAL REPORTS
[1] Chih-Wei Hsu, Chih-Chung Chang, and Chih-Jen Lin. A practical guide to support vector classification. Technical report, Department of Computer Science, National Taiwan University, 2003. URL http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf.
[2] Hsuan-Tien Lin and Chih-Jen Lin. A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods. Technical report, Department of Computer Science, National Taiwan University, 2003. URL http://www.csie.ntu.edu.tw/~cjlin/papers/tanh.pdf.
[3] Jen-Hao Lee and Chih-Jen Lin. Automatic model selection for support vector machines. Technical report, Department of Computer Science and Information Engineering, National Taiwan University, 2000.
[4] Chih-Jen Lin. Study in Large-Scale Optimization. PhD thesis, University of Michigan, Ann Arbor, Michigan, 1998.
[5] Chih-Jen Lin and Romesh Saigal. A predictor corrector method for semi-definite linear programming. Technical report, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, MI 48109-2117, 1995.
[6] Chih-Jen Lin and Romesh Saigal. An infeasible start predictor corrector method for semi-definite linear programming.
Technical report, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, MI 48109-2117, 1995.

• SOFTWARE
1. LIBSVM: an integrated software for support vector classification and regression, released April 2000. (with C.-C. Chang) (http://www.csie.ntu.edu.tw/~cjlin/libsvm) More than 900,000 downloads from April 2000 to October 2016. About 28,000 Google Scholar citations (up to October 2016).
2. LIBLINEAR: a library for large linear classification, released April 2007. (with my research group) (http://www.csie.ntu.edu.tw/~cjlin/liblinear) More than 150,000 downloads from April 2007 to October 2016.
3. BSVM: a decomposition method for large-scale support vector machines, released February 2000. (with C.-W. Hsu) (http://www.csie.ntu.edu.tw/~cjlin/bsvm)
4. TRON: a bound-constrained optimization software, released May 1999. (with J. J. Moré) (http://www.mcs.anl.gov/~more/tron)
5. ICFS: an incomplete Cholesky factorization for sparse matrices, released August 1998. (with J. J. Moré) (http://www.mcs.anl.gov/~more/icf)

• INVITED TALKS AND MISCELLANEOUS PRESENTATIONS
1. “Matrix factorization and factorization machines for recommender systems.” Keynote at the 4th Workshop on Large-Scale Recommender Systems, Boston, September 2016.
2. “When and when not to use distributed machine learning?” Keynote at International Winter School on Big Data, Bilbao, Spain, February 2016.
3. “Large-scale linear and kernel classification.” Invited talk at Microsoft Research India Summer School 2015 on Machine Learning, June 15, 2015.
4. “Matrix factorization and factorization machines for recommender systems.” Invited talk at SDM workshop on Machine Learning Methods on Recommender Systems, May 2, 2015.
5. “Big-data machine learning: status and challenges.” Invited talk at China R Conference, Hangzhou, China, October 29, 2014.
6.
“Experiences and lessons in developing machine learning software.” Invited talk at Industry Track, ACM Conference on Information and Knowledge Management (CIKM), Shanghai, November 4, 2014.
7. “Large-scale linear classification: status and challenges.” Invited talk at San Francisco Machine Learning Meetup, October 30, 2014.
8. “Big-data machine learning.” Invited speech at eBay Data Summit, Shanghai, China, October 25, 2014.
9. “Big-data analytics: challenges and opportunities.” Keynote speech at Taiwan Data Science Conference, Taipei, August 30, 2014.
10. “Distributed data classification.” Invited talk at Workshop on New Learning Frameworks and Models for Big Data, ICML, June 25, 2014.
11. “Distributed data classification.” Invited talk at Workshop on Scalable Data Analytics, PAKDD, May 13, 2014.
12. “Large-scale machine learning.” Invited talk at International Conference on Big Data and Cloud Computing, Xiamen, China, December 29, 2013.
13. “Distributed Newton methods for CTR (Click Through Rate) prediction.” Invited talk at Mysore Park workshop on distributed computing for machine learning and optimization, India, December 19, 2013.
14. “Recent advances in large-scale linear classification.” Invited talk at Asian Conference on Machine Learning, November 15, 2013.
15. “Experiences and lessons in developing machine learning and data mining software.” Invited talk at China R Conference, Shanghai, China, November 2, 2013.
16. “Optimization and machine learning.” Plenary talk at 11th EUROPT Workshop on Advances in Continuous Optimization, Florence, Italy, June 26, 2013.
17. “Optimization and machine learning.” 25th Simon Stevin Lecture, K. U. Leuven Optimization in Engineering Center, Leuven, Belgium, January 17, 2013.
18. “Machine learning software: design and practical use.” Invited talk at Machine Learning Summer School (MLSS), Kyoto, August 2012.
19.
“Experiences and lessons in developing industry-strength machine learning and data mining software.” Invited talk at Industry Practice Expo of ACM KDD 2012, Beijing, August 2012.
20. “Machine learning software: design and practical use.” Invited talk at Machine Learning Summer School (MLSS), Santa Cruz, July 2012.
21. “Large-scale machine learning in distributed environments.” Tutorial at ACM International Conference on Multimedia Retrieval, June 2012.
22. “Support vector machines and kernel methods.” Invited tutorial at Asian Conference on Machine Learning, Tokyo, Japan, November 8, 2010.
23. “Support vector machines and kernel methods.” Plenary talk at International Workshop on Recent Trends in Learning, Computation, and Finance, Pohang, Korea, August 30, 2010.
24. “Training support vector machines: status and challenges.” Invited speaker at ICML 2008 Workshop on Large Scale Learning Challenge.
25. “Training support vector machines: status and challenges.” Invited speaker at Google Machine Learning Summit, May 2008.
26. “Support vector machines.” Invited tutorial speaker at Machine Learning Summer School (MLSS), Taipei, July 2006.
27. “Training linear and non-linear SVMs.” Invited talk at Workshop on Mathematics and Medical Diagnosis, Erice, Italy, July 2006.
28. “Support vector machines for data classification.” Invited tutorial at ICONIP 2005, Taiwan, October 30, 2005.
29. “Optimization issues in training support vector machines.” The 16th International Conference on Algorithmic Learning Theory, Singapore, October 9, 2005 (invited talk).
30. “Support vector machines for data classification.” XXXVI Annual Conference of the Italian Operational Research Society, Camerino, Italy, September 8, 2005 (invited plenary talk).
31. “Generalized Bradley-Terry model and multi-class probability estimates.” ISI (International Statistical Institute) 2005, Australia, April 6, 2005 (talk in an invited session).
32.
“Report on NIPS 2003 Feature Selection Competition.” NIPS workshop on feature selection competition, Canada, December 12, 2003.
33. “Optimization techniques for data mining and machine learning.” Invited talk at Workshop on Optimization and Control, National Cheng Kung University, Tainan, Taiwan, January 6, 2003.
34. “Support vector machines for time series segmentation.” Invited talk at the 2002 Taipei International Statistical Symposium and Bernoulli Society EAPR Conference, Taipei, July 7-10, 2002.
35. “Support vector machines for protein classification/prediction.” Invited talk at the 8th Symposium on Recent Advances in Biophysics, Taipei, May 23, 2002.
36. “Automatic model selection using the decomposition methods.” NIPS workshop on kernel methods, Breckenridge, CO, December 1, 2000.
37. “Newton’s method for support vector machines.” Talk at the Sixth SIAM Conference on Optimization, Atlanta, May 1999.
38. “Structural optimization and semidefinite programming.” Talk at INFORMS Fall Meeting, Seattle, October 1998.
39. “Preconditioning dense linear systems from large-scale semidefinite programming problems.” Talk at the Fifth Copper Mountain Conference on Iterative Methods, Copper Mountain, Colorado, April 1998.
40. “Incomplete Cholesky factorizations with limited memory.” Talk at the Fourth Kalamazoo Symposium on Matrix Analysis & Applications, Kalamazoo, MI, October 1997.
41. “Newton’s method for large bound-constrained optimization problems.” Talk at International Symposium on Mathematical Programming, Lausanne, Switzerland, August 1997.
42. “An unconstrained convex programming approach for solving linear semi-infinite programming problems.” Talk at International Symposium on Mathematical Programming, Lausanne, Switzerland, August 1997.
43. “An infeasible start predictor corrector method for semidefinite linear programming.” Talk at Fifth SIAM Optimization Conference, Victoria, British Columbia, Canada, May 1996.

• ACADEMIC SERVICES
1.
Editorial services
– Action editor, Data Mining and Knowledge Discovery, 2009–
– Editorial board member, ACM Transactions on Intelligent Systems and Technology, 2012–
– Associate editor, IEEE Transactions on Neural Networks, 2005–2010
– Associate editor, Journal of Information Science and Engineering, 2009–2013
– Guest editor: special issue on Support Vector Machines, Neurocomputing, 2003
2. Reviewer for the following journals
– Journal of Machine Learning Research
– Machine Learning
– Neural Computation
– SIAM Journal on Matrix Analysis and Applications
– SIAM Journal on Optimization
– IEEE Transactions on Neural Networks
– IEEE Transactions on Pattern Analysis and Machine Intelligence
– IEEE Transactions on Knowledge and Data Engineering
– IEEE Transactions on Big Data
– IEEE Transactions on Fuzzy Systems
– IEEE Transactions on Image Processing
– IEEE Transactions on Signal Processing
– IEEE Signal Processing Letters
– IEEE Transactions on Evolutionary Computation
– IEEE Transactions on Systems, Man, and Cybernetics
– IEEE Transactions on Semiconductor Manufacturing
– IEEE Transactions on Antennas and Propagation
– IEEE Transactions on Automation Science and Engineering
– IEEE Transactions on Audio, Speech and Language Processing
– Biometrika
– Neurocomputing
– Bioinformatics
– BMC Bioinformatics
– Theory of Computing Systems
– Neural Processing Letters
– Signal Processing
– International Journal of Pattern Recognition and Artificial Intelligence (IJPRAI)
– Artificial Intelligence Review
– Pattern Analysis & Applications
– Computational Intelligence and Neuroscience
– IIE Transactions
– Annals of the Institute of Statistical Mathematics
– Journal of Statistical Planning and Inference
– Statistics and Computing
– Communications in Statistics
– Pattern Recognition
– Pattern Recognition Letters
– Knowledge and Information Systems
– Computational Optimization and its Applications
– INFORMS Journal on Computing
– Journal of Global Optimization
– Optimization
– Numerical Algorithms
– Information Processing and Management
– Internet Electronic Journal of Molecular Design
– International Journal of Operations and Quantitative Management
– International Journal of Computer Mathematics
– Journal of Information Science and Engineering
– Journal of Computer Science and Technology (JCST)
– Journal of the Formosan Medical Association
– Journal of the Chinese Institute of Industrial Engineers
– Journal of the Chinese Institute of Engineers
– Journal of the Chinese Institute of Electrical Engineering
3. Reviewer for several book chapters
4. Conference chair, area chair, or senior program committee member:
– General co-chair, Mysore Park workshop on distributed computing for machine learning and optimization, India, 2013
– Senior PC, ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2013, 2014, 2015, 2016
– Senior PC, SIAM International Conference on Data Mining (SDM), 2017
– Area chair, Neural Information Processing Systems (NIPS), 2007, 2010, 2011, 2015
– Area chair, International Conference on Machine Learning (ICML), 2016, 2017
– General co-chair, Asian Conference on Machine Learning (ACML), 2011
– Senior PC, Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2013, 2014, 2015, 2016
– Senior PC, AAAI 2017
– Senior PC, IJCAI 2011 (IEAI track)
– Senior PC, Asian Conference on Machine Learning (ACML), 2010
5. Program committee member:
– AAAI 2016
– International Joint Conference on Artificial Intelligence (IJCAI), 2015
– Workshop on Large-Scale Recommender Systems at ACM RecSys, 2014
– The International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines: Theory and Applications (ROKS-2013), Belgium
– ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), Washington D.C. 2010, San Diego 2011, Beijing 2012
– SIAM International Conference on Data Mining (SDM), 2013
– AI & Statistics 2010, 2014
– NIPS Workshop on Optimization for Machine Learning (2008, 2009, 2010, 2011, 2012, 2013)
– NIPS Workshop on AutoML (2014)
– International Conference on Machine Learning (ICML), Helsinki 2008, Montreal 2009, Haifa 2010, Bellevue, WA 2011, Scotland 2012, Atlanta 2013, Beijing 2014
– European Conference on Machine Learning (ECML) and European Conference on Principles and Practice of Knowledge Discovery in Databases (PKDD), 2008, 2010
– International Joint Conference on Neural Networks (IJCNN), Hong Kong 2008, San Jose, CA 2011
– Pacific-Rim Conference on Multimedia (PCM), Hong Kong 2007, Bangkok, Thailand 2009, Shanghai, China 2010
– IEEE International Conference on Multimedia & Expo (ICME), Beijing 2007, Hannover 2008
– Asian Conference on Machine Learning (ACML), 2009
– NIPS 2006 Workshop on Machine Learning Open Source Software
– ACM Multimedia Conference (ACM MM), Santa Barbara 2006
– International Colloquium on Grammatical Inference (ICGI), Japan 2006
– International Workshops on Statistical Techniques in Pattern Recognition (SPR), Hong Kong 2006, Orlando, Florida 2008, Turkey 2010
– Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), Singapore 2006, China 2007, Osaka, Japan 2008, Thailand 2009
– International Conference on Neural Information Processing (ICONIP), India 2004, Hong Kong 2006
– International Workshop on Pattern Recognition with Support Vector Machines (SVM2002), Canada
– Fourth Asia-Pacific Conference on Industrial Engineering and Management Systems, 2002, Taiwan
6.
Reviewer for the following conferences
– Neural Information Processing Systems (NIPS), 2003, 2004, 2005, 2006, 2014, 2016
– Conference on Learning Theory (COLT), 2003, 2009
– International Joint Conference on Neural Networks (IJCNN), 2003, 2004, 2005
– IEEE International Conference on Multimedia & Expo (ICME), 2009
– First Asia-Pacific Bioinformatics Conference, Australia, 2003
– The Seventh Pacific Rim International Conference on Artificial Intelligence (PRICAI-02)
7. Other conference planning and administration
– Special session organizer, ICONIP 2002, Singapore
8. Thesis External Reviewers:
– University of Trento, Italy: Nicola Segata (Ph.D. 2009)
– Australian National University: Jin Yu (Ph.D. 2009)
– Ruhr-Universität Bochum: Tobias Glasmachers (Ph.D. 2008)
– Hong Kong University of Science and Technology: Ivor Tsang (Ph.D. 2007)
– National University of Singapore: Chu Wei (Ph.D. 2003), Kaibo Duan (Ph.D. 2003), Jianbo Yang (Ph.D. 2011)
– Nanyang Technological University: Mingkui Tan (Ph.D. 2014)
– Chinese University of Hong Kong: Wan Zhang (M.Phil. 2003)
9. Proposal Reviewers:
– Research Grants Council, Hong Kong, 2006, 2007, 2008, 2009, 2010
– American University of Beirut, 2009
– Czech Science Foundation, 2010
10. Other Services:
– IEEE CS Society fellow evaluation committee member (2011, 2012)

• TALKS IN ACADEMIC INSTITUTES AND INDUSTRY
– International:
1. Microsoft, Redmond, Washington, October 6, 2016
2. Guangdong University of Technology, Guangzhou, China, June 20, 2016
3. Samsung Research America, California, June 10, 2016
4. UC Davis, California, May 4, 2016
5. Netflix, California, May 3, 2016
6. Huawei Research Labs, Shenzhen, China, April 19, 2016
7. Chinese University of Hong Kong, Shenzhen, China, April 18, 2016
8. Facebook, California, November 13, 2015
9. Quora, California, November 12, 2015
10. Nanjing University, December 25, 2014
11. University of Electronic Science and Technology of China, Chengdu, China, November 30 and December 1, 2014 (two talks)
12.
Twitter, California, October 31, 2014
13. eBay China, October 24, 2014
14. Microsoft Research, New York City, August 22, 2014 (open machine learning software workshop)
15. Microsoft, Redmond, Washington, August 18, 2014
16. Criteo, California, August 1, 2014
17. Databricks, California, July 31, 2014
18. Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, China, June 27, 2014
19. Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, June 26, 2014
20. Samsung Research America, California, May 23, 2014
21. Walmart Labs, California, April 23, 2014
22. Pandora, California, April 22, 2014
23. Alibaba, Hangzhou, China, December 27, 2013
24. eBay China, November 1, 2013
25. Shanghai Jiao Tong University, October 31, 2013
26. Microsoft Research, Redmond, August 15, 2013
27. University of Rome “La Sapienza,” June 25, 2013
28. K. U. Leuven, Belgium, January 14-16, 2013
29. Baidu, China, October 24, 2012
30. Luminescent Technology, California, August 24, 2012
31. eBay Machine Learning Forum, San Jose, California, February 17, 2012
32. City University of Hong Kong, December 30, 2011
33. NEC Labs, Cupertino, California, August 26, 2011
34. Adobe, California, August 25, 2011
35. eBay Research, San Jose, California, December 7, 2010
36. Facebook, Palo Alto, California, December 6, 2010
37. Baidu, China, September 3, 2010
38. Google Research New York, July 29, 2010
39. Yahoo! Research, Santa Clara, California, July 23, 2010
40. China Agriculture University, October 16, 2009
41. Microsoft Research Asia, October 13, 2009
42. Department of Computer Science and Engineering, Hong Kong University of Science and Technology, February 5, 2009
43. Department of Computer Science and Technology, Tsinghua University, China, September 5, 2008
44. HP Labs China, June 26, 2008
45. IBM T. J. Watson Research Center, May 16, 2008
46. Department of Industrial and Operations Engineering, University of Michigan, August 15, 2007
47.
Yahoo! Research, Santa Clara, California, February 20, 2007
48. NEC Labs, Princeton, New Jersey, February 15, 2007
49. Siemens Corporate Research, Princeton, New Jersey, February 14, 2007
50. AT&T Research, February 13, 2007
51. California Institute of Technology, November 14, 2006
52. School of Information and Computer Science, University of California, Irvine, November 6, 2006
53. Yahoo! Research, Burbank, California, August 30, 2006
54. Mathematics and Computer Science Division, Argonne National Lab., June 23, 2006
55. Chinese University of Hong Kong, Hong Kong, December 12, 2005
56. Nanyang Technological University, Singapore, October 10, 2005
57. Università di Roma “La Sapienza” and Istituto di Analisi dei Sistemi ed Informatica del CNR, Italy, September 1-2, 2005 (a short course)
58. CWI (Dutch National Research Institute for Mathematics and Computer Science), February 9, 2004
59. Department of Electronics and Computer Science, University of Southampton, February 2-6, 2004 (two talks)
60. Department of Computer Science, University of Essex, January 22, 2004
61. Department of Statistics and Probability Theory, Vienna University of Technology, September 4, 2003
62. Fraunhofer Institute for Computer Architecture and Software Technology, Germany, August 18, 2003
63. Department of Computer Science, University of Essex, August 13, 2003
64. University of Freiburg, Germany, July 15, 2003
65. Max Planck Institute of Biological Cybernetics, Germany, July 9, 2003
66. University of Tuebingen, Germany, July 8, 2003
67. KXEN Corporation, Suresnes, France, February 17, 2003
68. Max Planck Institute of Informatics (Computer Science), Germany, February 10-16, 2003 (two talks)
69. Max Planck Institute of Biological Cybernetics, Germany, January 12-February 10, 2003 (three talks)
70. Department of Electrical and Computer Engineering, University of Michigan-Dearborn, August 27, 2002
71. Siemens Corporate Research, Princeton, New Jersey, August 21, 2002
72.
Department of Computer Science, Binghamton University, August 19, 2002
73. Merck Research Lab., New Jersey, August 16, 2002
74. Agilent Inc., Colorado, July 31, 2001
75. Ford Research Lab., Michigan, July 24, 2001
76. Department of Electrical Engineering, Ohio State University, August 29, 2000

– Domestic:
1. Institute of Statistics, National Tsing Hua University, April 29, 2016
2. Industrial Technology Research Institute, October 7, 8, 21, and 22, 2015 (a short course on data mining)
3. Industrial Technology Research Institute, July 18, 22, 24, and August 12, 2014 (a short course on data mining)
4. Interdisciplinary Science Program, National Chiao Tung University, March 28, 2014
5. Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, September 24, 2012
6. Department of Mathematics, National Taiwan University, October 17, 2011
7. Department of Financial and Computational Mathematics, Providence University, September 22, 2011
8. Department of Mathematics, National Taiwan Normal University, April 20, 2011
9. Department of Applied Informatics, Fo Guang University, April 14, 2011
10. Department of Information Management, National Taiwan University, February 25, 2011
11. Institute of Information Science, Academia Sinica, February 16, 2011
12. Department of Computer Science and Information Engineering, Chaoyang University of Technology, October 29, 2010
13. Graduate Institute of Communication Engineering, National Taiwan University, September 27, 2010
14. Department of Computer Science and Information Engineering, National Central University, November 12, 2008
15. Department of Information Management, Chaoyang University of Technology, October 30, 2007
16. Department of Computer Science and Information Engineering, National Cheng-Kung University, October 26, 2007
17. Department of Computer Science, National Chengchi University, November 10, 2005
18. Department of Computer Science, National Chi-Nan University, September 24, 2004
19.
Institute of Information Science, Academia Sinica, April 15, 2004
20. Department of Statistics, National Chiao Tung University, April 9, 2004
21. Computer and Communications Research Laboratories, Industrial Technology Research Institute, February 27 and March 3, 2004 (8 hours)
22. Computer and Communications Research Laboratories, Industrial Technology Research Institute, November 18, 2003
23. Department of Information Management, Chaoyang University of Technology, November 4, 2003
24. Graduate Institute of Industrial Engineering, National Taiwan University, April 23, 2003
25. Department of Mathematics, National Taiwan University, March 10, 2003
26. Department of Information and Computer Engineering, Chung Yuan Christian University, December 16, 2002
27. Department of Statistics, Feng Chia University, November 1, 2002
28. Department of Statistics, National Chengchi University, October 14, 2002
29. Asian BioInnovations Corporation, Taipei, June 14, 2002
30. Graduate Program in Bioinformatics, National Yang Ming University, March 29, 2002
31. Department of Information Science and Management, Providence University, March 22, 2002
32. Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, March 11, 2002
33. Institute of Statistical Science, Academia Sinica, January 16, 2002
34. Department of Mathematics, National Taiwan University, January 5, 2002
35. Institute of Computer Science and Information Engineering, Chang Gung University, December 4, 2001
36. Graduate Institute of Medical Informatics, Taipei Medical University, November 22, 2001
37. Department of Information Management, National Taichung Institute of Technology, October 23, 2001
38. Graduate Institute of Industrial Engineering, National Taiwan University, October 3, 2001
39. Department of Information Management, National Taiwan University of Science and Technology, September 27, 2001
40. Department of Biological Science and Technology, National Chiao Tung University, September 26, 2001
41.
Institute of Information Science, Academia Sinica, August 28-29, 2001
42. Department of Computer Science and Information Engineering, National Cheng-Kung University, May 25, 2001
43. Department of Information and Computer Education, National Taiwan Normal University, April 9, 2001
44. Institute of Statistical Science, Academia Sinica, February 19, 2001
45. Department of Computer Science and Information Engineering, National Central University, January 17, 2001
46. Division of Biostatistics and Bioinformatics, National Health Research Institutes, December 6, 2000
47. Institute of Biochemistry, National Yang-Ming University, June 5, 2000
48. Department of Computer Science and Information Engineering, National Chung Cheng University, May 22, 2000
49. Institute of Information Science, Academia Sinica, November 19, 1999
50. Department of Computer Science, National Tsing Hua University, June 2, 1999
51. Department of Computer Science and Information Engineering, National Taiwan University, March 5, 1999
52. Department of Industrial Engineering, National Tsing Hua University, December 24, 1998
53. Department of Computer Science and Information Engineering, National Taiwan University, December 26, 1997
54. Department of Mathematics, National Cheng-Kung University, December 19, 1997
55. Institute of Information Management, National Chi-Nan University, December 18, 1997
56. Department of Industrial Engineering, National Tsing Hua University, December 17, 1997
57. Department of Mathematics, National Cheng-Kung University, May 1997

• TEACHING EXPERIENCE
1. Operations research (Fall 1998, Fall 1999, Fall 2000)
2. Scientific computing (Winter 1999, Winter 2000)
3. Numerical methods (Winter 2001, Winter 2002, Winter 2003, Winter 2009, Winter 2010, Winter 2011, Winter 2013, Winter 2014, Winter 2016)
4. Statistical learning theory (Fall 1999, Fall 2000, Fall 2001, Fall 2002, Fall 2003, Fall 2004, Fall 2005)
5. Data mining and machine learning (Fall 2001, Fall 2002, Winter 2004, Winter 2005, Winter 2006, Winter 2007)
6.
Introduction to the theory of computation (Fall 2003, Fall 2004, Fall 2005, Fall 2007, Fall 2008, Fall 2009, Fall 2010, Fall 2011, Fall 2012, Fall 2013, Fall 2014, Fall 2015)
7. Machine learning: theory and practice (Winter 2007, Winter 2010, Winter 2013)
8. Optimization and machine learning (Fall 2010, Fall 2011, Winter 2014, Fall 2015)

• MEMBERSHIPS: IEEE (fellow), ACM (fellow), AAAI (fellow)
