
pytorch mini batch size

2019.12.19 06:12

WHRIA · Views: 93

https://stackoverflow.com/questions/52518324/how-to-compensate-if-i-cant-do-a-large-batch-size-in-neural-network/52523847


In PyTorch, when you perform the backward step (by calling loss.backward() or similar), the gradients are accumulated in place. This means that if you call loss.backward() multiple times, the previously computed gradients are not replaced; instead, the new gradients are added onto the previous ones. That is why, when using PyTorch, it is usually necessary to explicitly zero the gradients between minibatches (by calling optimiser.zero_grad() or similar).
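
A minimal sketch (not part of the original answer) that shows this accumulation behaviour directly on a single scalar parameter:

import torch

w = torch.tensor(1.0, requires_grad=True)

(2 * w).backward()
print(w.grad)    # tensor(2.)

(2 * w).backward()
print(w.grad)    # tensor(4.) -- the new gradient is added to the old one

w.grad.zero_()   # this is what optimiser.zero_grad() does for every parameter
print(w.grad)    # tensor(0.)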

If your batch size is limited, you can simulate a larger batch size by breaking a large batch up into smaller pieces, and only calling optimiser.step() to update the model parameters after all the pieces have been processed.

For example, suppose you are only able to do batches of size 64, but you wish to simulate a batch size of 128. If the original training loop looks like:

optimiser.zero_grad()
loss = model(batch_data) # batch_data is a batch of size 128
loss.backward()
optimiser.step()

then you could change this to:

optimiser.zero_grad()

smaller_batches = batch_data[:64], batch_data[64:128]
for batch in smaller_batches:
    loss = model(batch) / 2   # rescale so the summed gradients match the full batch of 128
    loss.backward()           # gradients from this piece are added to those already stored

optimiser.step()

and the updates to the model parameters would be the same in each case (apart, perhaps, from some small numerical error). Note that you have to rescale the loss to make the update the same: because the gradients of the two half-batches are summed, dividing each loss by 2 recovers the average you would have obtained from the single batch of 128.
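
For completeness, here is a self-contained sketch of the same pattern with a toy model. The nn.Linear model, MSE loss, and SGD optimiser are illustrative assumptions, not part of the original answer:

import torch
from torch import nn

model = nn.Linear(10, 1)
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

batch_data = torch.randn(128, 10)      # the "large" batch of 128
batch_targets = torch.randn(128, 1)

optimiser.zero_grad()
for start in range(0, 128, 64):                    # two pieces of size 64
    inputs = batch_data[start:start + 64]
    targets = batch_targets[start:start + 64]
    loss = criterion(model(inputs), targets) / 2   # rescale the loss as above
    loss.backward()                                # gradients accumulate across pieces
optimiser.step()                                   # one parameter update for the whole 128

Starting from the same parameters, this loop produces the same update as a single step on the full batch of 128, up to floating-point error.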
