@kanzure
kanzure / wxpythontesting.py
Last active May 23, 2025 15:50
random wxpython testing example
import wx
import wx.adv
import time
# --------------------------
# GUI Application Definition
# --------------------------
class MyFrame(wx.Frame):
    def __init__(self):
        super().__init__(parent=None, title="Simple wxPython App")
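The preview cuts off inside __init__; assuming the frame only needs to be created and shown, the usual wxPython bootstrap to make this snippet runnable would be roughly the following (the explicit frame.Show() call is an assumption, since the rest of __init__ is not visible):

if __name__ == "__main__":
    app = wx.App()
    frame = MyFrame()
    frame.Show()
    app.MainLoop()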
@kanzure
kanzure / voice.py.diff
Created April 13, 2025 19:04
use groq whisper with aider for /voice
diff --git a/aider/voice.py b/aider/voice.py
index c9af7ae9..ae4bfba8 100644
--- a/aider/voice.py
+++ b/aider/voice.py
@@ -167,7 +167,7 @@ class Voice:
         with open(filename, "rb") as fh:
             try:
                 transcript = litellm.transcription(
-                    model="whisper-1", file=fh, prompt=history, language=language
+                    model="groq/whisper-large-v3", file=fh, prompt=None, language=language
@kanzure
kanzure / wait_for.py
Created November 22, 2024 15:32
Python's asyncio.wait_for timeout parameter measures total elapsed (wall-clock) time, not the function's own execution time. Is there a standard-library way to wait_for with a timeout that times the awaitable itself instead of the overall Python process?
import time
import asyncio
EXPECTED_INNOCENT_COROUTINE_DURATION = 5 # seconds
INNOCENT_COROUTINE_TIMEOUT = EXPECTED_INNOCENT_COROUTINE_DURATION + 10 # seconds
INNOCENT_COROUTINE_TOTAL_TIME = 0
async def do_nasty_work():
print("Starting a nasty, busy coroutine")
@kanzure
kanzure / pipermail-extractor.sh
Last active December 10, 2024 18:27
Use the bitcoin-dev pipermail archive and create individual files for each email, using the pipermail order for counting. This should create an identical mapping between URLs from lists.linuxfoundation.org and the actual emails. I use this mapping on gnusha.org/url for the redirect service. Try using https://gnusha.org/url if you need redirect s…
#!/bin/bash
#
# Used for processing the mbox files found in:
# https://diyhpl.us/~bryan/irc/bitcoin/bitcoin-dev/bitcoin-dev-ml-archive.2024-02-09-004.tar.gz
#
# Why?
#
# Linux Foundation has deprecated lists.linuxfoundation.org, and now we need a url rewriting map
# for the pipermail archive to numbered email files.
#
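The shell script itself is truncated above; as an illustration of the same idea in Python (not the script's actual logic), the standard-library mailbox module can split an mbox into sequentially numbered per-message files, which is the kind of numbering the pipermail URLs rely on. The paths and the zero-based numbering here are placeholders.

import mailbox
import os

MBOX_PATH = "bitcoin-dev.mbox"   # placeholder; the real script walks the extracted archive
OUTPUT_DIR = "emails"

os.makedirs(OUTPUT_DIR, exist_ok=True)

# Pipermail numbers messages in archive order, so enumerate in mbox order.
for index, message in enumerate(mailbox.mbox(MBOX_PATH)):
    outpath = os.path.join(OUTPUT_DIR, "{:06d}.txt".format(index))
    with open(outpath, "w", errors="replace") as fh:
        fh.write(str(message))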
@kanzure
kanzure / merchantcash.py
Created March 27, 2022 01:36
Webcash merchant demo using python/flask
"""
A small Flask web application to demonstrate a shopping cart with simple checkout.
This file was written mostly by GitHub Copilot!
Only the webcash API endpoint required specialized knowledge.
"""
import secrets
import datetime
import json
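Only the docstring and the first imports survive in the preview above; a minimal sketch of a Flask cart with a checkout route, in the same spirit but not the gist's actual code, is shown below. The route names, the in-memory cart, and the placeholder checkout step are all assumptions (the real demo talks to a webcash API endpoint).

import secrets
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory carts keyed by a per-session token; a real app would use a session store.
CARTS = {}

@app.route("/cart/new", methods=["POST"])
def new_cart():
    token = secrets.token_hex(16)
    CARTS[token] = []
    return jsonify({"cart": token})

@app.route("/cart/<token>/add", methods=["POST"])
def add_item(token):
    item = request.get_json(force=True)  # e.g. {"name": "widget", "price": 5}
    CARTS.setdefault(token, []).append(item)
    return jsonify({"items": CARTS[token]})

@app.route("/cart/<token>/checkout", methods=["POST"])
def checkout(token):
    items = CARTS.pop(token, [])
    total = sum(item.get("price", 0) for item in items)
    # Placeholder: the real demo submits payment to a webcash API endpoint here.
    return jsonify({"total": total, "paid": True})

if __name__ == "__main__":
    app.run(debug=True)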
@kanzure
kanzure / split_log.py
Last active September 25, 2021 12:56
Split an irssi IRC log file into multiple files (by date)
#!/usr/bin/python
#author: Bryan Bishop <[email protected]>
#date: 2010-03-16
#updated: 2020-02-10
from datetime import datetime
strptime = datetime.strptime
logs = open("irclogs.txt", "r").read().split("\n")
LOG_OPENED = "--- Log opened "
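The preview stops before the splitting logic; a hedged sketch of the approach, scanning for irssi's "--- Log opened" lines and starting a new per-date file at each one, could look like the following. The timestamp format of the log-opened line ("%a %b %d %H:%M:%S %Y", e.g. "Tue Mar 16 00:00:00 2010") is an assumption about the irssi defaults and is not taken from the gist.

from datetime import datetime

LOG_OPENED = "--- Log opened "
logs = open("irclogs.txt", "r").read().split("\n")

current_file = None
for line in logs:
    if line.startswith(LOG_OPENED):
        # e.g. "--- Log opened Tue Mar 16 00:00:00 2010"
        opened = datetime.strptime(line[len(LOG_OPENED):].strip(), "%a %b %d %H:%M:%S %Y")
        if current_file:
            current_file.close()
        current_file = open(opened.strftime("%Y-%m-%d") + ".log", "a")
    if current_file:
        current_file.write(line + "\n")
if current_file:
    current_file.close()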
@kanzure
kanzure / timestamper.py
Last active July 15, 2020 02:21
tool for timestamping IRC logs
"""
Timestamps!
This file creates opentimestamps timestamps for daily IRC logs. It works when
run on a daily basis (called from a cronjob) or when invoked manually to cover
multiple days since the last run.
The script identifies all new log files and creates timestamps for them.
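The gist body is cut off here; a rough sketch of the workflow described, finding log files newer than the last run and stamping them with the opentimestamps client, might look like the following. It assumes the ots command-line tool is installed, and the directory and state-file paths are placeholders.

import glob
import os
import subprocess

LOG_DIR = "logs"          # placeholder; daily IRC logs live here
STATE_FILE = ".last_run"  # records when timestamps were last created

last_run = os.path.getmtime(STATE_FILE) if os.path.exists(STATE_FILE) else 0

for logfile in sorted(glob.glob(os.path.join(LOG_DIR, "*.log"))):
    ots_path = logfile + ".ots"
    # Stamp any log file modified since the last run that has no .ots proof yet.
    if os.path.getmtime(logfile) > last_run and not os.path.exists(ots_path):
        subprocess.run(["ots", "stamp", logfile], check=True)

# Touch the state file so the next run only looks at newer logs.
with open(STATE_FILE, "w"):
    pass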
@kanzure
kanzure / blstoy.cpp
Created April 3, 2017 01:09
Short attempt at verifying an aggregate signature using BLS and a single pubkey
#include <iostream>
#include "bls.h"
int main() {
    std::cout << "Signature aggregation toy attempt\n";
    // create instance of Bls class
    bls::Bls my_bls = bls::Bls();
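The C++ preview ends before any keys or signatures are created, and depends on whichever bls.h it was built against. As an illustration of the same idea in Python (sign, aggregate, then verify against a single pubkey), a sketch using the py_ecc library's IETF-style BLS interface might look like this; the library choice and the toy key material are assumptions, not what the gist used.

from py_ecc.bls import G2ProofOfPossession as bls

# Toy secret key and message; real code would derive the key with bls.KeyGen().
secret_key = 12345
message = b"Signature aggregation toy attempt"

public_key = bls.SkToPk(secret_key)
signature = bls.Sign(secret_key, message)

# Aggregate a (single-element) list of signatures, then verify the aggregate
# against the single pubkey over the same message.
aggregate = bls.Aggregate([signature])
assert bls.FastAggregateVerify([public_key], message, aggregate)
print("aggregate signature verified against a single pubkey")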
@kanzure
kanzure / find-most-recent-common-block.py
Last active August 23, 2016 03:40
Find the most recent common block between two different, partially synchronized blockchain data stores. Attempt to do so without calling the _batch RPC call for 500,000 getblockhash requests. See https://botbot.me/freenode/bitcoin-core-dev/2016-04-28/?msg=65077020&page=3
"""
Toy demo for finding the most recent common block between a database and a
bitcoind instance. The goal is to correctly handle reorgs and long periods of
downtime if any. The previous implementation used getblockhashes and the _batch
RPC call to get 300,000 blockhashes at a time. Unfortunately bitcoind is not
really made to handle requests of that size. Instead, here is a more optimized
implementation that should take less time and a minimal number of RPC requests.
The output of this is fed into a planner which generates a plan for processing
the blocks between a current blockhash and the last most recent common block.
"""
@kanzure
kanzure / unexpected-docker-build-cache-miss.md
Created February 13, 2016 17:12
Unexpected docker build cache miss after successful --no-cache
git checkout master # setup Dockerfile
docker-compose build # uses cache (expected)
git checkout HEAD^1 # removes some line from Dockerfile
docker-compose build # uses cache (expected)
git checkout master # adds back same line to Dockerfile
docker-compose build --no-cache # does not use input from the cache (expected)
git checkout HEAD^1 # removes same line from Dockerfile
docker-compose build # has unexpected cache miss