You are an expert programming assistant focusing on:
- TypeScript, React, Node.js, AstroJS 5.x, AstroDB
- Shadcn UI and Tailwind CSS usage
- Latest features and best practices
- Clear, readable, and maintainable code
#!/usr/bin/env python3
"""
Ollama Model Manager

This script provides two modes of operation:
- export: Exports an installed Ollama model to a tarball.
- import: Imports an Ollama model from a tarball.

The export mode reads a manifest file located at:
<base_path>/manifests/registry.ollama.ai/library/<model_name>/<model_size>
"""

# train_grpo.py
#
# See https://github.com/willccbb/verifiers for ongoing developments
#
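The export mode described above can be sketched roughly as follows. This is a minimal illustration, not the script's actual implementation: the function name, the manifest's `config`/`layers` fields, and the blob naming convention (digest with `:` replaced by `-`, which is how Ollama commonly lays out `blobs/` on disk) are assumptions here.

```python
import json
import tarfile
from pathlib import Path

def export_model(base_path: str, model_name: str, model_size: str, out_tar: str) -> list[str]:
    """Hypothetical sketch of the export mode: read the manifest, then pack
    the manifest plus every blob layer it references into a gzipped tarball."""
    base = Path(base_path)
    manifest_rel = Path("manifests/registry.ollama.ai/library") / model_name / model_size
    manifest = json.loads((base / manifest_rel).read_text())

    # Collect the manifest itself plus each referenced blob.
    # Assumption: blobs live under blobs/, keyed by digest with ':' -> '-'.
    members = [manifest_rel]
    for layer in manifest.get("layers", []) + [manifest.get("config", {})]:
        digest = layer.get("digest")
        if digest:
            members.append(Path("blobs") / digest.replace(":", "-"))

    with tarfile.open(out_tar, "w:gz") as tar:
        for rel in members:
            tar.add(base / rel, arcname=str(rel))
    return [str(m) for m in members]
```

Import would then be the mirror image: extract the tarball into `<base_path>` so the manifest and blobs land back in the same relative locations.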
citation:
@misc{brown2025grpodemo,
  title={Granular Format Rewards for Eliciting Mathematical Reasoning Capabilities in Small Language Models},
  author={Brown, William},
}
#!/bin/bash
### steps ####
# verify the system has a CUDA-capable GPU
# download and install the NVIDIA CUDA toolkit and cuDNN
# set up environment variables
# verify the installation
###
### to verify your GPU is CUDA-enabled, check
# Setup Ubuntu
sudo apt update --yes
sudo apt upgrade --yes

# Get Miniforge and make it the main Python interpreter
wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-aarch64.sh -O ~/miniforge.sh
bash ~/miniforge.sh -b -p ~/miniforge
rm ~/miniforge.sh
echo 'export PATH="$PATH:$HOME/miniforge/bin"' >> ~/.bashrc
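What the final `.bashrc` line accomplishes is plain executable lookup: once a directory is on `PATH`, its binaries resolve by name. A small self-contained demonstration of that mechanism (the `mytool` executable and temp directory are made up for illustration):

```python
import os
import shutil
import stat
import tempfile

with tempfile.TemporaryDirectory() as d:
    # Create a tiny fake executable in a scratch directory.
    exe = os.path.join(d, "mytool")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\necho hi\n")
    os.chmod(exe, os.stat(exe).st_mode | stat.S_IEXEC)

    # Not found when only /usr/bin is searched...
    assert shutil.which("mytool", path="/usr/bin") is None
    # ...but found once the directory is appended, which is exactly
    # what the PATH line added to ~/.bashrc does for ~/miniforge/bin.
    assert shutil.which("mytool", path="/usr/bin" + os.pathsep + d) == exe
```

After opening a new shell (or running `source ~/.bashrc`), `conda` and Miniforge's `python` resolve the same way.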
Chinese brand "ITworks", model TW891, distributed in France and Belgium by Darty.