#!/usr/bin/env ruby
# frozen_string_literal: true
#
# MiniGPT client v1.0
#
# gem install --user-install ruby-openai
#
# Enable Zsh options for history
setopt EXTENDED_HISTORY
setopt INC_APPEND_HISTORY
setopt SHARE_HISTORY
setopt HIST_FIND_NO_DUPS

# Clear local history and load a fresh session for the current directory
function load_local_history() {
  if [[ -f .zsh_cmd_history ]]; then
    # Clear in-memory history and reload only the local history file
    fc -p .zsh_cmd_history
  fi
}
class RoutingTest < ActionDispatch::IntegrationTest
  IGNORED_CONTROLLERS = Set[
    "Rails::MailersController"
  ]

  test "no unrouted actions (public controller methods)" do
    actions_by_controller.each do |controller_path, actions|
      controller_name = "#{controller_path.camelize}Controller"
      next if IGNORED_CONTROLLERS.include?(controller_name)
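The test above is cut off before its assertion, but the idea appears to be diffing each controller's public instance methods against the actions the router knows about. Here is a self-contained sketch of that diff; the `PostsController` stand-in, the `ROUTED` table, and the `unrouted_actions` helper are all hypothetical (the real test would pull this data from Rails' route set):

```ruby
# Hypothetical stand-in for a real controller.
class PostsController
  def index; end
  def show; end
  def draft_preview; end # public, but never routed
end

# Assumed shape: controller path => routed action names.
ROUTED = { "posts" => %w[index show] }.freeze

# Public instance methods defined directly on the controller class
# that have no matching route.
def unrouted_actions(controller_path, controller_class)
  actions = controller_class.public_instance_methods(false).map(&:to_s)
  (actions - ROUTED.fetch(controller_path, [])).sort
end

p unrouted_actions("posts", PostsController) # prints ["draft_preview"]
```

Any name the diff returns is a public method that would 404 if requested, which is exactly what a test like this wants to flag.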
require "strscan"

class Scanner
  # Single-character token table, indexed by byte value
  TOK = []
  TOK["{".ord] = :LBRACE
  TOK["}".ord] = :RBRACE
  TOK[";".ord] = :SEMI

  def initialize data
    @scan = StringScanner.new data
    @prev_pos = @scan.pos
  end
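The class above sets up the token table and the scanner but is cut off before any scanning logic. Here is a minimal, self-contained sketch of how a single-character tokenizer might use such a table; the `tokenize` helper is hypothetical and not part of the original Scanner:

```ruby
require "strscan"

# Token table indexed by byte value, mirroring the Scanner class above.
TOKENS = []
TOKENS["{".ord] = :LBRACE
TOKENS["}".ord] = :RBRACE
TOKENS[";".ord] = :SEMI

# Hypothetical helper: walk the input one character at a time,
# skipping whitespace and emitting the table's token symbols.
def tokenize(data)
  scan = StringScanner.new(data)
  result = []
  until scan.eos?
    next if scan.skip(/\s+/)
    ch = scan.getch
    tok = TOKENS[ch.ord]
    result << tok if tok
  end
  result
end

p tokenize("{ ; }") # prints [:LBRACE, :SEMI, :RBRACE]
```

Indexing an array by `String#ord` keeps the lookup O(1) per character, which is why a table beats a case statement once the hot loop matters.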
#!/Users/aaron/.rubies/arm64/ruby-trunk/bin/ruby
# This is a demo language server for Ruby, written in Ruby. It just checks
# the syntax of Ruby programs when you save them.
#
# Configure this in Vim by adding the vim-lsp plugin, then doing this
# in your vimrc:
#
# au User lsp_setup
#     \ lsp#register_server({
# main
llama-index
langchain
ChatGPT appeared like an explosion on all my social media timelines in early December 2022. While I keep up with machine learning as an industry, I wasn't focused so much on this particular corner, and all the screenshots seemed like they came out of nowhere. What was this model? How did the chat prompting work? What was the context of OpenAI doing this work and collecting my prompts for training data?
I decided to do a quick investigation. Here's all the information I've found so far. I'm aggregating and synthesizing it as I go, so it's currently changing pretty frequently.
#!/bin/bash
BASE_PATH=/var/opt/gitlab/gitlab-rails/shared/registry/docker/registry/v2/repositories
DRY_RUN=0
KEEP_LAST_IMAGES=10
RUN_GARBAGE_COLLECTOR=0
GITLAB_CTL_COMMAND=$(which gitlab-ctl)
It's relatively easy to scale out stateless web applications: often a reverse proxy is all you need. Stateful web applications, though, especially ones that embed WebSocket services, are a pain to distribute across a cluster. The traditional approach is to introduce an external service like Redis to handle pubsub, but that usually means changing your code. Can Erlang/Elixir, the "concurrency-oriented" programming languages, beat the alternatives in this use case? Does the Phoenix framework already ship a solution for horizontally scaling WebSockets? I'll run an experiment to prove (or disprove) that.