Tengfei
Cadet

LLM training on OpenShift

LLM training involves a large number of GPUs. How can we integrate, manage, and allocate GPU resources on OpenShift? If anyone has experience with this, please share it here so we can discuss.
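For concreteness, here is a minimal sketch of how a single training pod can request GPUs on OpenShift, assuming the NVIDIA GPU Operator (or device plugin) is installed so that nodes advertise the nvidia.com/gpu extended resource. The namespace, image, and training script are placeholders, and it uses the kubernetes Python client:

```python
# Hypothetical example: request 2 GPUs for one training pod on OpenShift.
# Assumes the NVIDIA GPU Operator (or device plugin) is installed so the
# nvidia.com/gpu extended resource is advertised, and that the "kubernetes"
# Python client and a valid kubeconfig are available.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "llm-train-worker"},
    "spec": {
        "restartPolicy": "Never",
        "containers": [{
            "name": "trainer",
            "image": "nvcr.io/nvidia/pytorch:24.01-py3",  # placeholder image
            "command": ["python", "train.py"],            # placeholder script
            "resources": {
                # GPUs are requested through limits; the scheduler only
                # places the pod on a node with enough unallocated GPUs.
                "limits": {"nvidia.com/gpu": "2"},
            },
        }],
    },
}

client.CoreV1Api().create_namespaced_pod(namespace="llm-training", body=pod)
```

Allocation across teams is then usually governed at the namespace level, for example with a ResourceQuota on the nvidia.com/gpu resource.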

2 Replies
Chetan_Tiwary_
Moderator

Hello @Tengfei!

Could you please elaborate a little on this issue? What do you mean by LLM training on OpenShift?

Tengfei
Cadet

It is large language model (LLM) training, which is one of the most popular research areas in AI. It requires GPU memory and compute at a very large scale. So how can we manage these resources to meet the compute requirements of LLM training and improve GPU efficiency both within and across AI compute servers?
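One common pattern for scaling across GPUs and servers is data-parallel training. Below is a minimal PyTorch DistributedDataParallel sketch, assuming the launcher (torchrun, or a PyTorchJob from the Kubeflow Training Operator) sets MASTER_ADDR, MASTER_PORT, RANK, WORLD_SIZE, and LOCAL_RANK for each worker pod; the model and data here are placeholders:

```python
# Minimal multi-GPU / multi-node data-parallel sketch with PyTorch DDP.
# Assumes the launcher (torchrun or a Kubeflow Training Operator PyTorchJob)
# sets MASTER_ADDR, MASTER_PORT, RANK, WORLD_SIZE and LOCAL_RANK per process.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="nccl")      # NCCL for GPU collectives
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)            # one process drives one GPU
    device = f"cuda:{local_rank}"

    model = torch.nn.Linear(4096, 4096).to(device)   # placeholder model
    model = DDP(model, device_ids=[local_rank])
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):                       # placeholder training loop
        x = torch.randn(8, 4096, device=device)
        loss = model(x).pow(2).mean()
        loss.backward()                          # gradients all-reduced across GPUs/nodes
        opt.step()
        opt.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Each process owns one GPU and gradients are all-reduced over NCCL. For very large models this is typically combined with sharding or model parallelism (for example FSDP or DeepSpeed ZeRO), but the launch pattern on the cluster stays the same.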
