[NLPL Task Force (A)] rolling your own BERT (and maybe ELMo) on Saga

Stephan Oepen oe at ifi.uio.no
Fri Jan 31 17:53:40 UTC 2020


belatedly, thanks for the link, antti!

> Here's a (quick and dirty) repo for the code we used to train FinBERT: https://github.com/haamis/DeepLearningExamples_FinBERT/tree/master/TensorFlow/LanguageModeling/BERT_nonscaling. This one has the sbatch files used: https://github.com/haamis/BERT-pretraining
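for readers unfamiliar with submitting such pretraining jobs via Slurm, a minimal sbatch sketch along the lines of the scripts in the second repo might look as follows. the account name, partition, and script path are placeholders (assumptions), not values from the FinBERT repos; check the actual sbatch files linked above for the real settings.

```shell
#!/bin/bash
#SBATCH --job-name=bert-pretrain      # descriptive job name
#SBATCH --account=<project>           # placeholder: your compute allocation
#SBATCH --partition=gpu               # placeholder: the cluster's GPU partition
#SBATCH --gres=gpu:4                  # request four GPUs on one node
#SBATCH --time=72:00:00               # wall-clock limit; pretraining runs long
#SBATCH --mem=64G                     # host memory for the data pipeline

# run_pretraining.sh is a placeholder for the actual training entry point
srun ./run_pretraining.sh
```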

the README claims that V100 gpus are required, whereas Saga only has
P100 cards.  so i gave up on the prospect of getting this to run in
norway and instead requested an allocation of billing units on Puhti
:-).  i think for your presentation next week, you can safely assume
that Saga is not a relevant target system for this work.

looking forward to the tutorial!  oe
