
$ sudowheel

Linux articles, guides, and deep dives

sudowheel ~ ai-ml-ops
$ nvidia-smi --query-gpu=name,memory.total --format=csv,noheader
NVIDIA GeForce RTX 4090, 24564 MiB
$ python -c "import torch; print(torch.cuda.is_available())"
True
$ ls ./ai-ml-ops/
ai-workloads-linux.md gpu-monitoring.md cuda-containers.md model-serving.md
Latest Articles