# Other Strategies of Deep Learning
In short: in multi-task learning we have one NN do several tasks at the same time, and each of these tasks helps all of the other tasks 🚀
In other words: let's say we want to build a detector for 4 classes of objects. Instead of building 4 NNs, one per class, we can build a single NN that detects all four classes 🤔 (the output layer has 4 units)
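The 4-unit output head described above can be trained with a sum of independent per-class binary cross-entropies rather than a softmax, so a single image may contain several of the objects at once. A minimal NumPy sketch (the function name and the toy logits are my own illustration, not from the notes):

```python
import numpy as np

def multi_task_loss(logits, labels):
    """Sum of independent per-class binary cross-entropies.

    Each of the 4 output units gets its own sigmoid, so one input can
    be positive for several classes at once (unlike softmax).
    logits: (batch, 4) raw outputs; labels: (batch, 4) in {0, 1}.
    """
    probs = 1.0 / (1.0 + np.exp(-logits))   # per-unit sigmoid
    eps = 1e-12                              # numerical safety for log
    bce = -(labels * np.log(probs + eps)
            + (1 - labels) * np.log(1 - probs + eps))
    return bce.sum(axis=1).mean()            # sum over tasks, mean over batch

# One image containing both class 0 and class 2:
labels = np.array([[1.0, 0.0, 1.0, 0.0]])
confident = np.array([[5.0, -5.0, 5.0, -5.0]])  # logits agreeing with labels
wrong = np.array([[-5.0, 5.0, -5.0, 5.0]])      # logits disagreeing
print(multi_task_loss(confident, labels) < multi_task_loss(wrong, labels))  # → True
```

Summing the per-unit losses is what lets every example contribute a gradient to all four output units at once, which is how each task helps the others.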
Multi-task learning makes sense when:

- 🤳 Training on a set of tasks that could benefit from having shared lower-level features
- ⛱ The amount of data we have for each task is quite similar (usually) ⛱
- 🤗 We can train a big enough NN to do well on all the tasks (instead of building a separate network for each task)
👓 In practice, multi-task learning is used much less often than transfer learning
- Briefly, some data processing or learning systems require multiple stages of processing
- End-to-end learning can take all these stages and replace them with just a single NN
👩🔧 Long story short: mapping directly from the input to the output with one NN, instead of chaining hand-designed intermediate stages
Pros and cons of end-to-end learning:

- 🦸♀️ Pro: lets the data speak, showing the power of the data
- ✨ Pro: less hand-designing of components needed
- 💔 Con: may need a large amount of data
- 🔎 Con: excludes potentially useful hand-designed components
Key question: do you have sufficient data to learn a function of the complexity needed to map x to y?
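As a toy illustration of that trade-off (entirely my own construction, not from the notes: the `|x|` target, the noise level, and the degree-6 polynomial standing in for a flexible "end-to-end" model are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = np.abs(x) + rng.normal(0, 0.01, size=200)  # toy task: y ≈ |x| plus noise

# Staged system: a hand-designed feature (here |x|) followed by a linear fit.
phi = np.abs(x)
w_staged = np.polyfit(phi, y, 1)

# "End-to-end" stand-in: fit a flexible model straight from raw x,
# letting the data determine the intermediate representation.
w_e2e = np.polyfit(x, y, 6)  # degree-6 polynomial as a crude flexible model

x_test = np.linspace(-1, 1, 50)
staged_err = np.mean((np.polyval(w_staged, np.abs(x_test)) - np.abs(x_test)) ** 2)
e2e_err = np.mean((np.polyval(w_e2e, x_test) - np.abs(x_test)) ** 2)
print(staged_err, e2e_err)
```

When the hand-designed feature matches the true structure, the staged system does well with little data; with enough data, the flexible model learns a comparable mapping on its own, which is exactly the key question above.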