This guide explains how to customize the lock/login screen in Kali Linux XFCE (LightDM).
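One common starting point: Kali's XFCE image uses LightDM with the GTK greeter, whose appearance is controlled by `/etc/lightdm/lightdm-gtk-greeter.conf`. A minimal sketch, assuming the GTK greeter is in use — the wallpaper path and theme names below are placeholders, not Kali defaults:

```ini
[greeter]
# Placeholder wallpaper path -- substitute your own image
background = /usr/share/backgrounds/kali/login-background.png
# Placeholder GTK and icon theme names
theme-name = Kali-Dark
icon-theme-name = Flat-Remix-Blue-Dark
```

Changes take effect the next time the greeter is shown (log out or reboot).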
```powershell
# Version: 0.1 (2025-01-18)
# License: MIT, use at your own risk
#
# This script disables the Lenovo-installed "Tobii experience" software and "nahimic" software.
# Tested on a Lenovo Legion Pro 5 (82WM) with Windows 11 24H2.
# Run it with `powershell.exe -noprofile -executionPolicy Bypass -File badlenovo.ps1`
# After running this script, you should be able to uninstall the "Tobii experience" app
# from the control panel (appwiz.cpl).
#
# After major updates, you may need to re-run this script.
```

---

First we need QEMU's disk tools installed. On most Linux systems, install the `qemu-utils` package; on Windows, download the QEMU disk image utility from the QEMU website.
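A minimal sketch of the underlying conversion step, using `qemu-img convert` with placeholder file names (the command is guarded so it only runs where qemu-img is installed):

```shell
# Convert a qcow2 image to VMDK (VMware's format); -f is the input format,
# -O the output format. File names here are placeholders.
SRC=disk.qcow2
DST=disk.vmdk
if command -v qemu-img >/dev/null 2>&1; then
  qemu-img convert -f qcow2 -O vmdk "$SRC" "$DST"
fi
```

The same pattern covers the other cases: `-O vhdx` targets Hyper-V, and a raw or VDI source only changes the `-f` argument.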

---

```powershell
# Add this snippet to your $PROFILE to make Bash's autocompletion available to PowerShell on Linux.
# Warning: adds ~500ms initialization time.
# References: https://brbsix.github.io/2015/11/29/accessing-tab-completion-programmatically-in-bash/
#             https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/register-argumentcompleter

# Find all commands that bash knows how to complete
$commands = bash -c 'source /usr/share/bash-completion/bash_completion && complete' | awk '{ print $NF }'
$commands += ls /usr/share/bash-completion/completions
$commands | ForEach-Object {
    Register-ArgumentCompleter -Native -CommandName $_ -ScriptBlock {
        param($wordToComplete, $commandAst, $cursorPosition)
        # Simplified completer body: fall back to bash filename completion.
        # The full bridge (see references above) drives bash's programmable completion.
        bash -c "compgen -f -- '$wordToComplete'" |
            ForEach-Object { [System.Management.Automation.CompletionResult]::new($_) }
    }
}
```
```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Management.Automation;
using System.Security;
using System.Text;
using Microsoft.Win32;
```
```python
import torch
import torch.nn as nn
import torch.optim as optim
import gym
import numpy as np
import pickle

# Hyperparameters
H = 200          # number of hidden-layer neurons
batch_size = 10  # how many episodes between parameter updates
```
| Category | Settings Page | URI Command | |
|---|---|---|
| Accounts | Access work or school | ms-settings:workplace | |
| Accounts | Email & app accounts | ms-settings:emailandaccounts | |
| Accounts | Family & other people | ms-settings:otherusers | |
| Accounts | Set up a kiosk | ms-settings:assignedaccess | |
| Accounts | Sign-in options | ms-settings:signinoptions | |
| Accounts | Sync your settings | ms-settings:sync | |
| Accounts | Windows Hello setup | ms-settings:signinoptions-launchfaceenrollment | |
| Accounts | Your info | ms-settings:yourinfo | |
| Apps | Apps & Features | ms-settings:appsfeatures |
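Any URI in the table can be launched from a command line on Windows: `start <uri>` from cmd.exe, or `Start-Process <uri>` from PowerShell. The sketch below only assembles the command string, so it stays inert off-Windows:

```shell
# Build the launch command for one of the table's URIs (picked arbitrarily).
URI="ms-settings:signinoptions"
CMD="start $URI"   # in PowerShell: Start-Process $URI
echo "$CMD"
```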
Good question! I am collecting human data on how quantization affects outputs. See here for more information: ggml-org/llama.cpp#5962
In the meantime, use the largest that fully fits in your GPU. If you can comfortably fit Q4_K_S, try using a model with more parameters.
See the wiki upstream: https://github.com/ggerganov/llama.cpp/wiki/Feature-matrix
These are NOT product / license keys that are valid for Windows activation.
These keys only select the edition of Windows to install during setup, but they do not activate or license the installation.
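The same edition choice can also be pinned without entering any key at all, by placing an `ei.cfg` file in the `sources\` folder of the install media. A minimal sketch — the edition name here is an example, not a recommendation:

```ini
[EditionID]
Professional
[Channel]
Retail
[VL]
0
```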
I was wondering why my deep learning rig took so long to fully boot: literally five minutes from reboot to the ssh service starting. Unlike desktop motherboards, the ROMED8-2T has two 10G RDMA ethernet controllers plus a dedicated IPMI port.

For a while I thought I must have done something wrong, but it turns out that interfaces which are configured yet have no link will block boot while the system waits for them to come online. The fix is to mark the unused interfaces as optional in your /etc/netplan/*.yaml:
```yaml
network:
  ethernets:
    enoXnpX:
      optional: true   # don't block boot waiting for this interface
      addresses:
        - ...
```