playlist | file_name | content |
|---|---|---|
Robotics_1_Prof_De_Luca | Robotics_1_Prof_De_Luca_Lecture_21_14_Nov_2014.txt | okay so uh we can now start looking from more distance on how a robotic system work we have detailed at least kinematics and inward kinematics we will leave to the differential analysis the next lecture but now we want to verify how see how robots are programmed and on top of that what is the general structure of a con... |
Robotics_1_Prof_De_Luca | Robotics_1_Prof_De_Luca_Lecture_09_17_Oct_2014.txt | now we will start seeing Some mathematics connected through the description of geometry and motion of robot manipulators since we have already mentioned that's are over 20 believe there will be a chain of region bodies connected by John's we will start today with a over viewing the way in with the ways in which we can ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_101_Recognition_Image_Classification.txt | hello and welcome to computer vision lecture number 10. in the past lectures we have focused our attention a lot on the geometric aspects of computer vision now in this lecture we're going to touch on another important subfield of computer vision which is recognition recognizing objects and it's a very data-driven fiel... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_83_ShapefromX_ShapefromX.txt | this is a short unit in which we briefly review the shape extraction techniques that we have learned about so far and also highlight some of the ones that we didn't have time to look into in-depth before we then move on to the fusion unit number four we have seen how to do binocular stereo matching by utilizing the ap ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_52_Probabilistic_Graphical_Models_Markov_Random_Fields.txt | let's start with markov random fields or in short mrfs but before we do that let's briefly recap probability theory in probability theory we're talking about random variables which are variables over which we define distributions so there's two types of random variables discrete random variables that we are considered ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_84_ShapefromX_Volumetric_Fusion.txt | we have seen many techniques to reconstruct geometry from image inputs but most of these reconstructions only considered a particular object or even just produced a single depth map and often when we want to do reconstruction we want to reconstruct entire scenes we want to reconstruct larger scenes scenes that are comp... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_24_Image_Formation_Image_Sensing_Pipeline.txt | in this last very short unit we're gonna discuss the rest of the image sensing pipeline which is what happens with the light after it has passed through the lens and arrived at the image plane until it is stored in a digital format such that we can use it for computer vision applications for instance here is a simple s... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_91_Coordinatebased_Networks_Implicit_Neural_Representations.txt | hi and welcome to computer vision lecture number nine at the university of tubingen this is a lecture that i have been looking forward to for a while as it is about a topic coordinate based neural networks or implicit shape representations that my group has worked a lot on and in fact we have proposed one of the first ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_121_Diverse_Topics_in_Computer_Vision_Input_Optimization.txt | hi and welcome to lecture number 12 which is the last lecture of this computer vision class and today i'm particularly excited because we just got to know that we received the best lecture award for this computer vision class from the computer science department and so i'm even more excited to tell you about the things... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_13_Introduction_History_of_Computer_Vision.txt | this unit is about a brief overview over the history of computer vision now of course it's a very shortened summary of what has happened over the last 50 years and it's a personalized viewpoint but with this being said i hope it still can provide you some intuition about on where the field has come from and where the f... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_62_Applications_of_Graphical_Models_MultiView_Reconstruction.txt | in this unit we're going to see a second application of graphical models in the context of dense multiview stereo reconstruction which is the problem of reconstructing a 3d scene not from just two images but from multiple images of that scene and the advantage of course similar to in the case of sparse reconstruction t... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_22_Image_Formation_Geometric_Image_Formation.txt | in the second unit we are going to discuss the geometric image formation process and basic camera models and we're going to see how points in 3d space are related to points on the 2d image plane the principle of projection um is an old one and as an example we can take for instance the human eye where light is passing ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_44_Stereo_Reconstruction_Spatial_Regularization.txt | in the previous units we always applied the winner takes all strategy which is for every pixel in the reference image we compute the similarity across all disparity hypotheses and then compute return the patch with the highest similarity or with the lowest matching cost while using deep learning to compute the similari... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_12_Introduction_Introduction.txt | in this second unit we're gonna introduce what computer vision is and also have a look at some of the challenges involved in this problem to put computer vision into context um let's consider it as part of what it is typically considered an aspect of the general field of artificial intelligence artificial intelligence ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_94_Coordinatebased_Networks_Generative_Radiance_Fields.txt | in this last unit we're going to take the representation from the previous unit the neural radiance field and try to build a generative model for it try to build a model that is capable of given just a collection of 2d image of a particular object category builds a general 3d model that can be rendered photorealistical... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_51_Probabilistic_Graphical_Models_Structured_Prediction.txt | hello and welcome to lecture number five of this computer vision course over the last few weeks or so i tried to improve a little bit of my audio setup i've purchased some new components and try to improve the acoustics and this is the result so i hope you enjoy the now new and hopefully improved audio quality of this ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_73_Learning_in_Graphical_Models_Deep_Structured_Models.txt | so far we discussed log linear models log linear models are models where the parameters w appear in a log linear fashion in the model equation in the probability distribution this assumption severely limits these type of models because the features must already be very powerful they must already do the heavy lifting th... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_63_Applications_of_Graphical_Models_Optical_Flow.txt | in the last unit of this lecture we're going to discuss the optical flow problem and how it can be formulated or thought of as inference in a markov random field optical flow is defined as the apparent motion of objects surfaces and edges in a visual scene caused by the relative motion between an observer and a scene s... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_81_ShapefromX_ShapefromShading.txt | hello and welcome to computer vision lecture number eight today's topic is shape from x which means reconstructing 3d geometry from images in the previous lectures we've already learned about two techniques one was binocular stereo reconstructing from two images using stereo matching techniques finding correspondences ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_72_Learning_in_Graphical_Models_Parameter_Estimation.txt | how can we now estimate the parameters of a conditional random field our goal is to maximize the likelihood of the outputs y conditioned on the inputs x with respect to the parameter vector w and as often the case in machine learning we're going to assume that the data is independent and identically distributed in orde... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_23_Image_Formation_Photometric_Image_Formation.txt | this unit is about the photometric image formation process so far we have discussed how individual light rays travel through space but it's also important how actually the light changes once it reaches the sensor plane so now we discuss how an image is formed in terms of pixel intensities and colors if we take a pictur... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_53_Probabilistic_Graphical_Models_Factor_Graphs.txt | while inference can be done directly on markov random fields we're now gonna discuss a graphical model that's a little bit more precise than simple markov random fields or markov networks and this is called a factor graph why do we need factor graphs let's consider mrfs again consider the following factorization into p... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_45_Stereo_Reconstruction_EndtoEnd_Learning.txt | in the last unit we're gonna be talking about end-to-end learning algorithms that not only learn the matching cost but directly take entire images as input and using a deep neural network directly output disparity maps and this was something that was really only possible with increasing compute in particular annotated ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_32_StructurefromMotion_Twoframe_StructurefromMotion.txt | in this unit we're going to discuss two frame structure from motion which is the most basic setup where we're given just two images taken from different viewpoints and we want to find the relative motion between the two cameras as well as the location of the 3d points that correspond to the projections that we observe ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_42_Stereo_Reconstruction_Block_Matching.txt | now we know how we obtain depth values from disparity values but how given such a rectified stereo setup can we actually determine the disparity values in the first place well of course given the left and the right image we somehow have to find how much each pixel in the reference image in the image that we want to com... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_71_Learning_in_Graphical_Models_Conditional_Random_Fields.txt | welcome to computer vision lecture number seven this is the last lecture in this little excursion on graphical models in lecture number five we've introduced graphical models and a very basic inference algorithm the belief propagation algorithm for inferring marginal or maximum upper story solutions in lecture number s... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_21_Image_Formation_Primitives_and_Transformations.txt | hello and welcome to the second lecture of this computer vision course if we want to build computer vision systems we first have to gain a good understanding of the image formation process of how a 3d scene is projected onto a 2d image and once we have an understanding of this process we can start developing models tha... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_111_SelfSupervised_Learning_Preliminaries.txt | hey and welcome to computer vision lecture number 11 on self-supervised learning this lecture is divided into four units in the first unit we are going to discuss some preliminaries and basic motivation for self-supervised learning in unit number two we're gonna look at the first type of self-civilized learning models ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_11_Introduction_Organization.txt | welcome to computer vision a lecture at the University of tubingen my name is Andreas Geiger and I am excited to be your lecturer for this course what is computer vision well simply speaking computer vision is the attempt to replicate the phenomenal perceptual capabilities of humans in a machine in other words we're co... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_43_Stereo_Reconstruction_Siamese_Networks.txt | while in the previous unit we talked about very simple handcrafted similarity metrics this unit is about learning similarity metrics from data using the power of deep neural networks the motivation for this is that hand crafted features and similarity matrix do not take into account all relevant geometric and radiometr... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_112_SelfSupervised_Learning_Taskspecific_Models.txt | task specific models these are self-supervised models that are specific to a particular target task downstream task as opposed to the models that we're going to consider in the later units that are trying to learn generic representations independent of any downstream task the first case that we're going to consider is ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_61_Applications_of_Graphical_Models_Stereo_Reconstruction.txt | welcome to lecture number six of this computer vision class in the last lecture we've introduced motivated by the ambiguities in the stereo matching problem we've introduced graphical models and a particular inference algorithm called belief propagation that allows us to compute maximum episterity solutions or map solu... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_114_SelfSupervised_Learning_Contrastive_Learning.txt | the last unit is about contrastive learning what is the problem with the pre-text tasks that we have discussed so far well in this pre-text task for pre-training the large chunk of parameters in the neural network we're considering a task that at first glance is completely decoupled from from the downstream task for ex... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_82_ShapefromX_Photometric_Stereo.txt | we saw how these ambiguities arising in the shape from shading problem can be partially addressed at least using strong smoothness assumptions now in this unit we're gonna see that these ambiguities can also be addressed using simply more observations and that's called the photometric stereo problem instead of using mo... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_102_Recognition_Semantic_Segmentation.txt | now let's move on to semantic segmentation there have been a few attempts to semantic segmentation also before the deep learning era but the performance of these approaches is nowhere near comparable to the performance of deep methods so we're going to focus on deep methods in this unit let's remind ourselves again wha... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_123_Diverse_Topics_in_Computer_Vision_Human_Body_Models.txt | this unit is on human body models humans interact through their bodies as illustrated here on the slide which is a slide i took from michael black who is a max planck director here in tubingen and who has dedicated a lot of his research to understanding and modeling human bodies and modeling and understanding human bod... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_92_Coordinatebased_Networks_Differentiable_Volumetric_Rendering.txt | we have seen these implicit representations as useful output representations for geometry appearance and motion but what we have done so far is always assumed full 3d supervision in order to train this model we always had to have a set of 3d points for which we knew if they are inside or outside the shape however in pr... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_31_StructurefromMotion_Preliminaries.txt | hello and welcome to lecture number three of this computer vision course last time we discussed the image formation process how a 3d point gets projected onto the 2d sensor plane now in this lecture in the structure from motion lecture we're going to utilize that knowledge in order to build 3d models from multiple imag... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_55_Probabilistic_Graphical_Models_Examples.txt | and finally in this unit i'd like to show you some very simple examples of this algorithm in practice and also guiding towards the exercise tasks while then in the next lecture we're going to consider more realistic examples with real problems such as stereo matching so the first example i want to show you is this vehi... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_103_Recognition_Object_Detection_and_Segmentation.txt | let's now move to object detection and some other related tasks that can be solved for similar algorithms such as object instance segmentation or object mesh prediction let's remind ourselves again what object detection is about the goal is to localize and classify all objects in the image we're not interested in the s... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_122_Diverse_Topics_in_Computer_Vision_Compositional_Models.txt | another research area that i'd like to show you is on compositional models it's about trying to learn about the world in terms of its compositional structures in order to build more robust models and models that are more useful and more semantically interpretable and there's many aspects of this research questions and ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_93_Coordinatebased_Networks_Neural_Radiance_Fields.txt | maybe one of the most popular follow-up works on this idea of representing shape and appearance implicitly is called nerf neural radiance fields the idea or the application of nerf is actually a little bit different from what we have considered it is a novel view synthesis which is the task of given sparsely sampled im... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_33_StructurefromMotion_Factorization.txt | so far we discussed how to [Music] obtain epipolar geometry from two views but of course it would be ideal to obtain both the camera pose and the 3d reconstruction directly by optimizing over many views of the scene in order to minimize the camera pose and 3d reconstruction error and this is what we're going to cover n... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_113_SelfSupervised_Learning_Pretext_Tasks.txt | pretext tasks actually both this unit and the last unit of this lecture cover pretext tasks but the last category of pretext tasks that we discovered the so-called contrastive tasks have developed into their own field that's why i separated these two units but both are effectively pre-text tasks what is a pretext task ... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_54_Probabilistic_Graphical_Models_Belief_Propagation.txt | in this unit we're going to see how we can do inference in graphical models how we can answer questions such as what is the marginal distribution of a certain subset of variables or what is the so-called maximum upper story solution what is the most likely configuration of random variables under that probabilistic mode... |
Computer_Vision_Andreas_Geiger | Computer_Vision_Lecture_41_Stereo_Reconstruction_Preliminaries.txt | hi and welcome to the fourth lecture of this computer vision course last time we discussed sparse 3d reconstruction techniques how to based on sparse correspondences or features reconstruct 3d structure sparsely from two or more views the topic of today is dense stereo reconstruction which is how we can obtain a more d... |
Introduction_to_Robotics | Lecture_12_Evolution_of_Robotics.txt | Welcome back. So, we will continue the introductory part of this robotics course. So, in the last class I briefly mentioned about the various applications of robots and then how people are actually using robotic in various fields. And we saw that because of the varied applications and the or all people from different w... |
Introduction_to_Robotics | Lecture_61_Principles_of_PMSM_Control.txt | we started looking at permanent magnet synchronous motors and we said that it is distinguished from the brushless dc motor by having a sinusoidal emf that will be induced in the rotor induced in the stator when the rotor begins to rotate so we have seen therefore that this will mean that you need to apply the sinuso... |
Introduction_to_Robotics | Lecture_42_Power_Electronic_Switching_and_Current_Ripple.txt | In the last class, were trying to look at how one can generate dc, which is of a different value as compared to what was there in these course. And we saw that you can generate a dc which has a different value that is some 100 volt. If you want a dc of 25 volt, one can generate that by using a switch and operating it i... |
Introduction_to_Robotics | Lecture_83_Particle_Filter.txt | Welcome to Lecture 3 in Week 10. And so, in this lecture, we are going to look at what are called non-parametric filters. So far, we have been looking at Gaussian filters for state estimation. And the Gaussian filters essentially assume that your belief distribution is of a specific functional form. In this case, it wa... |
Introduction_to_Robotics | Lecture_33_DC_Motor_Equations_and_Principles_of_Control.txt | In the discussion yesterday, we were looking at the DC motor and attempting to understand what are the basic physics behind the operation of the motor. So, this gives an elementary view of how the motor is made and the physics behind the operation is given. Now, if you see here, we had put the rotor this is a drawing, ... |
Introduction_to_Robotics | Lecture_101_Localization_Taxonomy.txt | Hello, everyone and welcome to the final week of lectures in the intro to robotics course. And as before, we will continue looking at the algorithm and computer science aspects of it. So, we have looked at, you know, what constitutes the notion of state, and then we looked at recursive state estimation, looked at motio... |
Introduction_to_Robotics | Lecture_62_Encoders_for_Speed_and_Position_Estimation.txt | In the last class we had looked at the set of machines that we had seen going from BC to BLDC to the synchronous motor with sinusoidal, and we saw what are the advantages and disadvantages of each one. So, apart from this there is one more AC machine which is the induction machine, induction motor. So, the induction mo... |
Introduction_to_Robotics | Lecture_213_Manipulator_Jacobian_and_Statics.txt | Welcome back, in the last class we discussed about the Manipulator the Jacobian and how the Jacobian can be calculated as well as some of the issues associated with the use of Jacobian like the the Singularities issues, as well as the Inverse of Jacobian. And then, we found that the singularity can be identified by loo... |
Introduction_to_Robotics | Lecture_32_Principles_of_DC_Motor_Operation.txt | So, in the last class we had looked at need for actuators and the operation requirements that one may expect. So, and we saw the different varieties of actuators, you may take for hydraulic and then electric. What are the advantages that one has with respect to another, and then we therefore we said that there are lot ... |
Introduction_to_Robotics | Lecture_73_Recursive_State_Estimation_Bayes_Filter_Illustration.txt | hello everyone and in this module uh we will start looking at exactly how the base filter algorithm will operate okay so as uh most of you would remember the base filter algorithm is going to have a prediction step right ah and followed by a measurement update or a correction update right so before we get into the w... |
Introduction_to_Robotics | Lecture_81_Kalman_Filter.txt | hello everyone welcome to the first module in week 10 so this week we are going to continue looking at recursive state estimation question right so in the last week we looked at the base filter algorithm right and so essentially uh so this is how it operated uh so we were given uh the belief in the previous state ri... |
Introduction_to_Robotics | Lecture_63_Stepper_Motors.txt | In the last class then, we were looking at sensor for speed and position. Mostly encoders are the ones that are nowadays used and these as we have seen, there are two varieties one is incremental encoder, that is what we saw in the last class, how the incremental encoder would work. And then the other variety that we h... |
Introduction_to_Robotics | Lecture_71_Introduction_to_Probabilistic_Robotics.txt | Hello everyone, and now we are going to be looking at the Computer Science module for the Introduction to Robotics course. My name is B. Ravindiran, and I am a faculty in the Computer Science Department, in IIT Madras. So, during the next few weeks, we will be looking at various issues that have to do with how robo... |
Introduction_to_Robotics | Lecture_23_Industrial_Robot_Kinematic_Structures.txt | Good morning, welcome back to the discussion on robot kinematics. So in the last class we briefly talked about the degrees of freedom. As I mentioned that in general, the degrees of freedom are the set of independent displacements that specify completely the displaced or deformed position of the body or system. But in ... |
Introduction_to_Robotics | Tutorial_2_Probability_Basics.txt | One of the important concepts in probability theory is that of the random variable. A random variable is a variable whose value is subject to variations. That is, a random variable can take on a set of possible different values, each with an associated probability. Mathematically, a random variable is a function from t... |
Introduction_to_Robotics | Lecture_212_Differential_Relations.txt | hello good morning welcome back so in the last few classes we discussed about the forward and inverse kinematics of manipulator so this forward and inverse basically is a position relationship so we were trying to see how we can relate the joint positions to the tool position or if you know the tool position how can... |
Introduction_to_Robotics | Lecture_52_Control_of_the_Brushless_DC_Motor.txt | In the last class we had started looking at Brushless DC Machine and these varieties of machines belongs to group a called permanent magnet AC Machines, which is PMAC; and this further belong to a group called synchronous motors. So, the brushless DC machines indeed the entire group, which is the synchronous motor vari... |
Introduction_to_Robotics | Lecture_210_Inverse_Kinematics.txt | Good morning, welcome back. So, we will start the discussion on Inverse Kinematics today. In the last few classes we talked about the forward kinematics, starting from the coordinate transformation matrix and how do we go to identify the DH parameters and then using DH parameters how do we get the transformation matrix... |
Introduction_to_Robotics | Lecture_51_The_Brushless_DC_Machine.txt | So, we had been looking at the DC motor operation and control and we said that the DC machine has certain difficulties due to which it is not really used much in the field now a days if you want to select a new machine, however the ability remains remains very much there, if you look at tools in your labs, because this... |
Introduction_to_Robotics | Lecture_26_DH_Algorithm.txt | Last class we discussed about the kinematic parameters of industrial robots. And we mentioned that there are 4 parameters. The first one we called as joint parameter. Joint parameters define the relative position and orientation of two successive links. And they are the joint angle and joint distance. These are the two... |
Introduction_to_Robotics | Lecture_22_Homogeneus_Transformation_Matrix.txt | Good morning. Welcome back to the discussion on Robot Kinematics. In the last class we discussed about Coordinate Transformation Matrix. That is if you have a mobile coordinate frame and a reference coordinate frame, how do we map these two coordinate frames using Coordinate Transformation Matrix was the discussion we ... |
Introduction_to_Robotics | Lecture_94_Range_Finder_Measurement_Model.txt | welcome to the final lecture of week 11 and so in this lecture we look at the last component that we need for making our estimation models work right so our state estimation models work which is essentially the measurement model so remember the measurement model tells you what the what is the probability of z given ... |
Introduction_to_Robotics | Lecture_84_Binary_Bayes.txt | so welcome back to the the fourth lecture in week 10 and we are going to continue looking at non-parametric filters right so if you remember we started looking this filters as a way of doing recursive state estimation right so what we are going to look at in this lecture is a very special case right where i really r... |
Introduction_to_Robotics | Lecture_27_DH_Algorithm.txt | So, in the last class we discussed about DH Algorithm, how do you assign the coordinate frames and once you assign the coordinate frame, how do you get the DH parameters. So, for each joint axis there will be one coordinate frame assigned including the base frame and then looking at the coordinate frames you can identi... |
Introduction_to_Robotics | Lecture_211_Inverse_Kinematics_Examples.txt | Hello, welcome back. So, we are discussing the Inverse Kinematics of Manipulators, in the last class we briefly mentioned about the method by which we can solve the inverse kinematics and we took a very simple example of a 2 degree of freedom planar manipulator to show how the equations can be solved and then we consid... |
Introduction_to_Robotics | Lecture_103_Path_Planning.txt | hello everyone so this week so far we considered the problem of mobile robot localization we looked at the localization taxonomy we also looked at the markov localization and we saw a little bit on how to do this localization given a feature based map right most of the algorithms are looking at grid based map right ... |
Introduction_to_Robotics | Lecture_28_Forward_Kinematics.txt | So, in the last few classes we discussed about the DH parameters and then saw how do we actually assign coordinate frames and then using coordinate frame how do you, how do you find out the DH parameters basically the four parameters theta, a, d and alpha. So, today we will see how do we actually use this one for devel... |
Introduction_to_Robotics | Lecture_31_Overview_of_Electric_Actuators_and_Operational_Needs.txt | I am Doctor Krishna Vasudevan from Electrical Department and we will be first looking at the subject of Electric Actuators for the purpose of robots. So, let me write that down. So, let us look at actuators and that is what we are going to talk about. Actuators are essentially elements in the system, elements in the sy... |
Introduction_to_Robotics | Lecture_93_Occupa_Grid_Mapping.txt | hello everyone and welcome to the third lecture in a week 11 in the intro to robotics course where we are going to continue looking at probabilistic robotics right so we have looked at various estimation problems and various kinds of models so far so in the last lecture we were talking about uh how the motion model ... |
Introduction_to_Robotics | Tutorial_1_Probability_Basics.txt | hello and welcome to the first tutorial in the introduction to machine learning course my name is priyatash i am one of the teaching assistants for this course in this tutorial we'll be looking at some of the basics of probability theory before we start let's discuss the objectives of this tutorial the aim here is ... |
Introduction_to_Robotics | Lecture_24_Robot_Architectures.txt | okay so in the last class we talked about various robot architectures so robot architecture basically comes from the type of joint and the way the joints are arranged and we saw that the first 3d view of freedom of the robot is used for positioning the wrist and these positioning joints decide the architecture of t... |
Introduction_to_Robotics | Lecture_82_Extended_Kalman_Filter.txt | Assumptions that we made for the Kalman Filter, so some things for you to remember is that the Kalman Filter is a bayes filter algorithm. And the crucial assumptions we made was that the state transitions and the measurement models were both linear. So the state transition was linear in the sense that xt was a linear f... |
Introduction_to_Robotics | Lecture_91_Velocity_Motion_Model.txt | Hello everyone and welcome to week 11 of the Introduction to Robotics course and so we are continuing looking at the CS aspect of the course. So far we have been looking at recursive state estimation and we talked about you know various assumptions we make on the belief model, the motion model and so on for and derived...
Introduction_to_Robotics | Lecture_92_Odometry_Motion_Model.txt | so in the last lecture we looked at the velocity motion model and so today we are going to look at the odometry motion model right so the odometry information is essentially information on how much the robot has moved right how much the position of the robot has changed and the odometry data is typically obtained f...
Introduction_to_Robotics | Lecture_53_The_PM_Synchronous_Motor_PMSM_and_SPWM.txt | So, what we have seen is that during the interval when any two switches are on in that interval the equivalent circuit can be ideally represented as a DC circuit. This is what we had seen here for the case of switches 1 and 6 being on what you have is the input DC source is now connected to the motor to those two phase... |
Introduction_to_Robotics | Lecture_11_Introduction.txt | Hello good morning to all of you and welcome to this course on introduction to Robotics. So, this in an introductory course offered in the department which actually covers the fundamentals of Robotics. As you know, Robotics is a very interesting field, so we thought that we should offer a course on Robotics but we want... |
Introduction_to_Robotics | Lecture_41_DC_Motor_Control_Regions_and_Principles_of_Power_Electronics.txt | In the last class, we talked more about how the DC motors actually works and when we are going to combine it with some sort of load, how do you then determine how or where the system will operate at which speed it will run? it is determined by this intersection between the speed versus, speed versus torque graph of the... |
Introduction_to_Robotics | Lecture_2_1_Kinematics_Coordinate_transformations.txt | Very good morning. Welcome back to this course on Introduction to Robotics. So we will start the topic Manipulator Kinematics today. In the last few classes I mentioned about the development of robotics in general and the applications of robotic technology in various fields, and I mentioned that industrial robotics is one...
Introduction_to_Robotics | Lecture_25_Kinematic_Parameters.txt | Hello, welcome back. In the last class we discussed about the robot architectures, the body and arm assembly configurations. And we found that there are two types of joints; the rotary and prismatic, and by arranging these rotary and prismatic joints, we will be able to get different body arm configurations. And we fou... |
Introduction_to_Robotics | Lecture_43_The_HBridge_and_DC_Motor_Control_Structure.txt | So, in the last class we looked at the circuit as we have drawn and we found that one can determine what is the magnitude of this increase that is, if you call this as delta i, then delta i is the magnitude of the ripple. And one can determine the magnitude of the ripple if you know what is the duration for which the s... |
Introduction_to_Robotics | Lecture_72_Recursive_State_Estimation_Bayes_Filter.txt | last lecture we were looking at state estimation we started talking about recursive state estimation and so we were talking about what constitutes a state right the state state at each time t could be a very complicated vector of various entities that you could record like the robot post the location in the world r... |
Introduction_to_Robotics | Lecture_29_Forward_Kinematics_Examples.txt | so yesterday we briefly talked about this problem the forward kinematics of a 6-axis articulated robot and the first part as I mentioned we were to assign the coordinate frames so we saw that there are I mean if you see six-axis robots so you will be able to identify L0 to L6 as the coordinate axes so we firs...
Introduction_to_Robotics | Lecture_102_Markov_Localization.txt | So, in last lecture we looked at the taxonomy of the localization problems. And, this lecture we will look at a very simple, the Markov Localization algorithm. So, if you look at this, this is exactly the Bayes filter algorithm except that instead of looking at, just the state, I also have to look at the map here. So, ... |
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2020_Introduction_to_Deep_Learning.txt | hi everyone, let's get started. Good afternoon and welcome to MIT 6.S191! This is really incredible to see the turnout this year. This is the fourth year now we're teaching this course and every single year it just seems to be getting bigger and bigger. 6.S191 is a one-week intensive boot camp on everything deep learn...
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2021_Convolutional_Neural_Networks.txt | Hi Everyone and welcome back to MIT 6.S191! Today we're going to be talking about one of my favorite topics in this course and that's how we can give machines a sense of vision now vision is one of the most important human senses I believe sighted people rely on vision quite a lot from everything from navigating in ... |
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2019_Visualization_for_Machine_Learning_Google_Brain.txt | all right so thank you for the invitation it's really exciting to take part of my afternoon to come over here and talk to you all about some of the work that we're doing as Ava said I co-lead a couple of things at Google one is this team called the big picture group and a lot of our work centers around data visualizati... |
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2019_Introduction_to_Deep_Learning.txt | good afternoon everyone thank you all for joining us my name is Alexander Amini and I'm one of the course organizers for 6.S191 this is MIT's official course on introduction to deep learning and this is actually the third year that we're offering this course and we've got a really good one in store for you this year ...
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2021_Deep_Generative_Modeling.txt | Hi everyone and welcome to lecture 4 of MIT 6.S191! In today's lecture we're going to be talking about how we can use deep learning and neural networks to build systems that not only look for patterns in data but actually can go a step beyond this to generate brand new synthetic examples based on those learned patte... |
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2023_The_Modern_Era_of_Statistics.txt | thank you hi everyone thanks Alexander for the introduction all right very excited to um talk about this modern era of Statistics so you you've heard throughout the lectures probably a lot about the you know deep learning and the technologies that enable this but what I want to talk about I want to say why it works oka...
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_Towards_AI_for_3D_Content_Creation.txt | great yeah thanks for a nice introduction um i'm gonna talk to you about 3d content creation and particularly deep learning techniques to facilitate 3d content creation most of the work i'm going to talk about is the work um i've been doing with my group at nvidia and the collaborators but it's going to be a little bit... |
MIT_6S191_Introduction_to_Deep_Learning | Barack_Obama_Intro_to_Deep_Learning_MIT_6S191.txt | This year I figured we could do something a little bit different and instead of me telling you how great this class is I figured we could invite someone else from outside the class to do that instead. So let's check this out first. Hi everybody and welcome MIT 6.S191, the official introductory course on deep learning t... |
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_2018_Deep_Learning_Limitations_and_New_Frontiers.txt | I want to bring this part of the class to an end so this is our last lecture but for our series of guest lectures and in this talk I hope to address some of the state of deep learning today and kind of bring up some of the limitations of the algorithms that you've been seeing in this class so far so we got a really goo... |
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_Convolutional_Neural_Networks.txt | [Music] hi everyone and welcome back to day two of 6.S191 introduction to deep learning So today we're going to be talking about a subject which is actually uh one of my favorite topics in this entire course and that's how we can give machines the sense of sight just like we all have so vision is one of the most impo...
MIT_6S191_Introduction_to_Deep_Learning | MIT_6S191_Building_AI_Models_in_the_Wild.txt | [Music] all right so let's give a Short Round of Applause to welcome Doug and Nico and let them take it [Applause] away awesome thanks so much and just a quick sanity check people can hear in the back it's okay not too loud not too quiet two3 all works okay fantastic all right thank you so much a bit of background on u... |