
The father of Tesla's Autopilot left to teach "online courses": AI bigwigs just love teaching and answering questions

  • luc
  • 2022-08-26 14:58:39
  • 152 reads
He created Stanford's first deep learning course, and now his teaching itch has struck again.

  A few years ago, Andrej Karpathy, then a doctoral student at Stanford University, did something the AI community greatly admired: he created an undergraduate deep learning course at his school and made all of the videos freely available to the public.

  As one of deep learning's early researchers and educators, Karpathy bet on the right direction. After more than a decade of rapid development and technological innovation, deep learning now drives key fields and industries such as search, image recognition, social networking, industrial automation, and autonomous driving.

  Thanks to his outstanding research record, Karpathy himself became a beneficiary of applied AI's rise. His first job after leaving academia for industry was at Tesla, where he served as director of AI, leading Autopilot and the R&D of Tesla's autonomous and assisted-driving technology, and later pushing the company into more cutting-edge areas such as robotics.

  Last month, Karpathy announced his official departure from Tesla.

  While people were still wondering where he would go next, he unsurprisingly "returned to his old trade" and became a teacher again.

  /put the "hammer" in everyone's hands/

  Just last week, Karpathy reactivated a secondary YouTube account that had been registered years ago but never used, and released a 2.5-hour instructional video, "The spelled-out intro to neural networks and backpropagation: building micrograd".

  He says the video is by far the most step-by-step and comprehensive explanation of backpropagation (one of the fundamental algorithms behind neural networks) and of how neural networks are built. He also claims that anyone with a basic grasp of Python and a little high-school calculus can follow along:

  "If you don't understand the core of backpropagation and neural networks after reading it, then I will live stream eating shoes.

  "

Image credit: Andrej Karpathy

  In the course, Karpathy uses plain Python and micrograd, a tiny autograd engine he wrote himself, to walk through the basics: building a neural network, writing a loss function, tuning parameters by hand, and, along the way, systematically introducing key concepts such as backpropagation.
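  The engine at the heart of the course is small enough to sketch here. What follows is not Karpathy's micrograd code, just a minimal illustration in plain Python of the kind of scalar autograd it implements: each value records how it was produced, and backward() replays the chain rule through the graph in reverse.

    # Minimal scalar autograd sketch (illustrative; not the actual micrograd source).
    class Value:
        """A scalar that tracks its own gradient for reverse-mode autodiff."""
        def __init__(self, data, _parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = _parents
            self._backward = lambda: None   # set by the operation that produced this value

        def __add__(self, other):
            other = other if isinstance(other, Value) else Value(other)
            out = Value(self.data + other.data, (self, other))
            def _backward():
                self.grad += out.grad            # d(out)/d(self) = 1
                other.grad += out.grad           # d(out)/d(other) = 1
            out._backward = _backward
            return out

        def __mul__(self, other):
            other = other if isinstance(other, Value) else Value(other)
            out = Value(self.data * other.data, (self, other))
            def _backward():
                self.grad += other.data * out.grad   # d(out)/d(self) = other
                other.grad += self.data * out.grad   # d(out)/d(other) = self
            out._backward = _backward
            return out

        def backward(self):
            # Topologically sort the graph, then apply the chain rule in reverse order.
            order, visited = [], set()
            def build(v):
                if v not in visited:
                    visited.add(v)
                    for p in v._parents:
                        build(p)
                    order.append(v)
            build(self)
            self.grad = 1.0
            for v in reversed(order):
                v._backward()

    # Usage: gradients of L = w*x + b with respect to its inputs
    w, x, b = Value(2.0), Value(3.0), Value(1.0)
    L = w * x + b
    L.backward()
    print(w.grad, x.grad, b.grad)   # 3.0 2.0 1.0

  The steps the article describes, building a network, writing a loss, and tuning by hand, are this same machinery extended from a handful of scalar operations.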

  As for teaching style, Karpathy clearly understands what today's young engineers want and practices "talk is cheap, show me the code": most of the class is simply a live capture of his own screen as he writes code and walks through it...

Image credit: Andrej Karpathy/YouTube

  Karpathy studied at the University of Toronto and Stanford University, and his teachers, Geoffrey Hinton and Fei-Fei Li, are today's leaders in deep learning. He is also a founding member of OpenAI and, early in his career, interned at well-known research institutions such as Google Brain, Google Research, and DeepMind.

  As a well-known scholar and practitioner in deep learning, Karpathy has been a frequent public presence: he regularly publishes papers and gives talks at leading academic conferences such as CVPR and NeurIPS, and has been a keynote speaker at NVIDIA's GPU Technology Conference (GTC).

  Amusingly, in 2014 he pitted his own flesh and blood against a convolutional neural network in the ImageNet challenge and won, earning him the nickname "the human benchmark of ImageNet" in academia and industry.

Image credit: Andrej Karpathy

  Still, what left the deepest impression on most people, and what they are most grateful to him for, is a great deed he did during his Ph.D.:

  Founded CS231n, Stanford's first deep learning course for undergraduates.

Image credit: Andrej Karpathy, Stanford University

  CS231n focuses on computer vision and explains deep learning in an accessible way. In its first year, 150 students enrolled; enrollment doubled in 2016 and doubled again the following year. It has since become one of the largest and most popular courses across all departments at Stanford, with 16 teaching assistants; Fei-Fei Li, who co-created the course with Karpathy, still serves as its lead lecturer.

  More importantly, starting from the course's second year, Karpathy released all of CS231n's lecture videos, slides, assignments, notes, and other materials online. Not just enrolled students, but anyone in the world with an internet connection could take the course for free.

  By then, the concept of the MOOC was already widely known, thanks to companies such as Udacity and Coursera founded several years earlier, and their platforms already offered plenty of computer science and machine learning courses.

  Even so, the arrival of CS231n, with its more advanced, constantly updated curriculum and its openness free of any commercial flavor, made it a key milestone in bringing cutting-edge deep learning to society at large.

Image credit: Andrej Karpathy, Stanford University

  On why he opened the course for free, Karpathy once said he felt strongly at the time that deep learning would become a revolutionary technology, one that, like a good hammer, could be put to use in every corner of society.

  But back then, many people couldn't even "buy" a hammer, let alone understand what it could do or how to use it, so he decided to stand up and be the one who handed everyone a hammer for free.

  The most interesting thing about the course is that it has no fixed syllabus set in stone:

  "In other subjects you might be learning 19th-century knowledge. In our class, we often discuss papers published last week, or even yesterday," Karpathy said. "This isn't nuclear physics or rocket science; you only need basic calculus and algebra to follow the course and understand and master what is happening right now. Every time the course changes, the experience is very different, but everyone enjoys it enormously."

  Since Karpathy was still a doctoral student when he created the course, teaching took up a great deal of his time and energy. He taught twice a year, four months at a stretch, giving it 120% of his energy while in session; even his own Ph.D. research had to be put on hold.

  "Nevertheless, I still think this class was the highlight of my Ph.D.

Andrej Karpathy. Image source: Andrej Karpathy

  If you only skim Karpathy's résumé, you might not think of him as a teaching scholar. In fact, he is genuinely passionate about teaching, mentoring, and answering questions, especially about sharing what he has learned and his own methods for mastering new skills.

  As a doctoral student he wrote up several pieces of advice and published them on Stanford's website and his personal blog. For undergraduates preparing for exams, he earnestly advised that "staying up late is not worth it," "ask questions often and make use of the TAs," and "study on your own early in exam prep and communicate more right before the test." For students agonizing over whether to pursue a Ph.D., he wrote a lengthy "Ph.D. survival guide," sharing guiding principles and concrete experience on preparation, choosing an advisor, picking research topics, publishing, giving academic talks, and more...

Image credit: Andrej Karpathy

  And if you think he only talks about deep learning, you're underestimating him:

  Even for non-professional topics, he uses his spare time for long-term, systematic research and testing, then writes it up. For example, he once published a "biohacking" article on his GitHub account, sharing his experiments with physical exercise, fasting, blood testing, supplements, sleep research, and more.

  Less well known is that Karpathy is also a seasoned Rubik's Cube teacher...

  He has an alter ego for Rubik's Cube solving called Badmephisto: he runs a website dedicated to teaching cube-solving, built an app (iPhone & Android) that teaches you how to solve the Rubik's Cube, and has uploaded plenty of tutorial videos to YouTube, with more than 9 million total views...

  The man really is addicted to teaching...

  Karpathy got a Google Glass during his internship at Google and recorded this clip of himself solving a Rubik's Cube while riding a bicycle. GIF source: Andrej Karpathy

  /AI bigwigs love to lecture/

  Notably, Karpathy is not alone: the big names in deep learning and AI all love to lecture.

  That may sound like stating the obvious, since many of these luminaries hold regular or tenured teaching posts at well-known universities, and those who moved into industry typically have strong academic and teaching backgrounds. Yet even now, busy in senior industry positions, they remain enthusiastic about spreading and popularizing deep learning knowledge.

  Take Karpathy's teacher at the University of Toronto, Professor Geoffrey Hinton. A pioneer of core deep learning algorithms such as backpropagation, one of the field's "Three Musketeers," and a Turing Award winner, he has been described this way: while others were still doubting machine and deep learning, Hinton quietly kept teaching in Toronto and pushing the research forward, ultimately reviving machine learning research and industry almost single-handedly, which is why he is called the father of deep learning.

  Since his startup DNNresearch was acquired by Google in 2013, Hinton has held a position at Google, but his main research base remains the Department of Computer Science at the University of Toronto. Although he has stopped teaching regular courses in recent years, he still spends considerable time and energy mentoring students and co-writing papers. He has also recorded free online courses on neural networks and deep learning.

Screenshot source: Coursera

  The other two "Musketeers," Yoshua Bengio and Yann LeCun, also hold teaching positions of their own.

  Bengio, a professor at the Université de Montréal, also founded the Mila artificial intelligence institute and serves as its scientific director. To this day he leads and mentors a large number of master's students, doctoral students, and postdoctoral fellows at the university and at Mila, and he frequently gives invited lectures at outside institutions, with the videos published online afterwards.

  LeCun is vice president and chief AI scientist at Meta, and also a professor of electrical and computer engineering at NYU. He has candidly admitted that he cannot always check and answer his university email and phone messages in time, yet as recently as last year he was teaching deep learning courses at NYU's Center for Data Science (NYU-CDS). Moreover, partly because of the pandemic, the deep learning course he taught at NYU-CDS in 2020 was put fully online and made free to all.

  Looking back home, Zhang Tong, the former director of Tencent AI Lab who left a few years ago to return to teaching, is another interesting example. An internationally known expert in machine learning, he has held tenure-track positions at American universities, senior research and management roles at IBM Research, Yahoo Research, and Baidu IDL, and has served as chair or area chair of top international conferences such as ICML and NIPS.

  Perhaps because he prefers the environment of teaching and academia, however, Zhang Tong left Tencent AI Lab in early 2019 and joined the Hong Kong University of Science and Technology as a chair professor in the Department of Mathematics and the Department of Computer Science and Engineering.

  Of course, he has not left industry entirely. He built a bridge between HKUST and Sinovation Ventures, helping the two institutions set up a joint laboratory focused on basic research. Today he does research and supervises students at HKUST, and this semester he is teaching two courses on machine learning optimization, COMP6211E and MATH6450J.

  These top scholars who love to teach, mentor, and answer questions show us that even in today's highly commercialized world, a technology and a body of knowledge can still be spread more efficiently, and as a public good, through the classic form of education combined with the newer means of free online courses.

  May more such people come along, and may they never stop.




