Paper Reading: Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism (02-26)