
coursera_week2_5_Computation Graph

by raphael3 2017. 10. 4.
A neural network's computation is carried out in two passes: a forward propagation step, which computes the network's output, followed by a backward propagation step, which computes derivatives (gradients). To explain the computation graph, let's use a simple example.
Let's say that we're trying to compute a function J, which is a function of three variables a, b, and c, and let's say that function is J = 3(a + bc). Computing this function actually has three distinct steps. First you compute bc and store it in a variable called u, so u = bc. Then you compute v = a + u. And finally, your output J is 3v. This is the final function J that you're trying to compute. We can take these three steps and draw them in a computation graph as follows. Draw the three variables a, b, and c. The first thing we did was compute u = bc, so the inputs to that node are b and c. Then we have v = a + u, whose inputs are the u we just computed together with a. And finally, we have J = 3v.
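
As a rough sketch, the three steps of the forward pass might look like this in plain Python (the names forward, u, v, and J follow the text; everything else is my own illustration, not code from the lecture):

def forward(a, b, c):
    """Forward (left-to-right) pass through the computation graph J = 3(a + bc)."""
    u = b * c   # step 1: u = bc
    v = a + u   # step 2: v = a + u
    J = 3 * v   # step 3: J = 3v
    return u, v, J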

As a concrete example: if a = 5, b = 3, and c = 2, then u = bc = 6, v = a + u = 5 + 6 = 11, and J = 3v, so J = 33.
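
Running the sketch above with these values reproduces the numbers from the example:

u, v, J = forward(a=5, b=3, c=2)
print(u, v, J)   # 6 11 33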

The computation graph comes in handy when there is some distinguished or special output variable, such as J in this case, that you want to optimize. (Once the quantity to be optimized, like J, is made a separate variable and tracked this way, the computation graph finally starts to look genuinely convenient and useful.) And what we're seeing in this little example is that, through a left-to-right pass, you can compute the value of J. What we'll see in the next couple of slides is that in order to compute derivatives, there will be a right-to-left pass like this, going in the opposite direction of the blue arrows. That direction is the most natural one for computing the derivatives.
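
As a preview of that right-to-left pass, here is a minimal sketch of the chain rule applied to the same graph (the names backward and dJ_da etc. are my own labels for illustration, not from the lecture):

def backward(a, b, c):
    """Backward (right-to-left) pass: chain rule on J = 3v, v = a + u, u = bc."""
    dJ_dv = 3     # from J = 3v
    dv_da = 1     # from v = a + u
    dv_du = 1
    du_db = c     # from u = bc
    du_dc = b
    dJ_da = dJ_dv * dv_da           # = 3
    dJ_db = dJ_dv * dv_du * du_db   # = 3c
    dJ_dc = dJ_dv * dv_du * du_dc   # = 3b
    return dJ_da, dJ_db, dJ_dc

With a = 5, b = 3, c = 2, this gives dJ/da = 3, dJ/db = 6, and dJ/dc = 9.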