[Hardware][TPU][V1] Multi-LoRA implementation for the V1 TPU backend (#14238)
Signed-off-by: Akshat Tripathi <akshat@krai.ai>
Signed-off-by: Chengji Yao <chengjiyao@google.com>
Co-authored-by: Chengji Yao <chengjiyao@google.com>
@@ -47,7 +47,7 @@ def dist_init():
     temp_file = tempfile.mkstemp()[1]
 
     backend = "nccl"
-    if current_platform.is_cpu():
+    if current_platform.is_cpu() or current_platform.is_tpu():
         backend = "gloo"
 
     init_distributed_environment(world_size=1,
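The change above falls back to the "gloo" backend on TPU as well as CPU, since NCCL only targets NVIDIA GPUs and cannot serve `torch.distributed` collectives on TPU hosts. A minimal sketch of that selection logic, using a hypothetical `select_backend` helper (not part of vLLM) in place of the `current_platform.is_cpu()` / `current_platform.is_tpu()` checks:

```python
def select_backend(platform: str) -> str:
    """Pick a torch.distributed backend for the given platform name.

    Hypothetical stand-in for the platform checks in the diff: NCCL is
    GPU-only, so CPU and TPU runs use the CPU-capable "gloo" backend.
    """
    if platform in ("cpu", "tpu"):
        return "gloo"
    return "nccl"


print(select_backend("tpu"))   # gloo
print(select_backend("cuda"))  # nccl
```

This mirrors the patched fixture: the default stays `"nccl"`, and only the condition guarding the `"gloo"` fallback widens to include TPU.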