A spot where I slipped up when adopting Temporal in an existing Python project, and then again when starting a new Python project, was defining a Workflow that invokes an Activity that calls a third-party library. Temporal outputs an error with a long stacktrace that I vaguely understood but didn’t immediately know the solution to:

```
...
    raise RestrictedWorkflowAccessError(f"{self.name}.{name}")
temporalio.worker.workflow_sandbox._restrictions.RestrictedWorkflowAccessError: Cannot access http.server.BaseHTTPRequestHandler.responses from inside a workflow.
```
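The error comes from Temporal’s workflow sandbox, which restricts what a workflow’s module may import. The workaround described in Temporal’s sandbox documentation is to wrap third-party imports in `workflow.unsafe.imports_passed_through()`. A rough sketch of the shape, using `requests` as a stand-in third-party dependency (the names here are mine, not from the original code):

```python
from datetime import timedelta

from temporalio import activity, workflow

# The sandbox re-imports the workflow's module on each run; passing the
# third-party import through tells Temporal not to restrict it.
with workflow.unsafe.imports_passed_through():
    import requests  # stand-in third-party dependency


@activity.defn
async def fetch_status(url: str) -> int:
    # Activities run outside the sandbox, so calling the library here is fine.
    return requests.get(url).status_code


@workflow.defn
class StatusWorkflow:
    @workflow.run
    async def run(self, url: str) -> int:
        # The workflow stays deterministic and delegates the real work to the activity.
        return await workflow.execute_activity(
            fetch_status,
            url,
            start_to_close_timeout=timedelta(seconds=30),
        )
```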
I wanted to stop the Obsidian editor cursor from blinking, something like VS Code’s `{ "editor.cursorBlinking": "solid" }`. Some searching turned up a CSS-based option that solves this in Vim mode, but in insert mode the cursor still blinks. Eventually, I came across a macOS-based approach on StackExchange, included here for convenience:

```
defaults write -g NSTextInsertionPointBlinkPeriod -float 10000
defaults write -g NSTextInsertionPointBlinkPeriodOn -float 10000
defaults write -g NSTextInsertionPointBlinkPeriodOff -float 10000
```

After running these, restart Obsidian and the cursor no longer blinks.
Cursor is VS Code plus a Cmd+K command that opens a text box for prompt-driven text generation. When I created this post, I first typed `insert hugo yaml markdown frontmatter`. In a few seconds, the editor output

```yaml
---
title: "Cursor Introduction"
date: 2023-08-12T20:00:00-04:00
draft: false
tags:
- cursor
- intro
---
```

This was almost exactly what I was looking for, except the date was not quite right, so I corrected that and accepted the generation.
The problem with long-running code in Next serverless functions

The current design paradigm at the time of this writing is called App Router. Next.js and Vercel provide a simple mechanism for writing and deploying cloud functions that expose HTTP endpoints for your frontend site to call. However, sometimes you want to do work asynchronously on the backend in a way that doesn’t block a frontend caller that needs to move on.
First attempt

I made an attempt to set up TypeChat to see what’s happening on the Node/TypeScript side of language model prompting. I’m less familiar with TypeScript than Python, so I expected to learn some things during the setup. The project provides example projects within the repo, so I tried to pattern off of one of those to get the sentiment classifier example running. I manage node with asdf. I’d like to do this with nix one day, but I’m not quite comfortable enough with it yet to keep it from becoming its own rabbit hole.
I downloaded Warp today. I’ve been using iTerm2 for years. It’s worked well for me, but Warp came recommended, so I figured I should be willing to give something different a chance. Warp looks like a pretty standard terminal except you need to sign in, as with most things SaaS these days. It looks like the beta is free but there is a paid version for teams. Warp puts “workflows” as first-class citizens of the editor experience.
promptfoo is a JavaScript library and CLI for testing and evaluating LLM output quality. It’s straightforward to install and get up and running quickly. As a first experiment, I’ve used it to compare the output of three similar prompts that specify their output structure using different modes of schema definition. To get started:

```sh
mkdir prompt_comparison
cd prompt_comparison
promptfoo init
```

The scaffold creates a prompts.txt file, and this is where I wrote a parameterized prompt to classify and extract data from a support message.
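promptfoo prompts are plain text templated with `{{variable}}` placeholders, so a parameterized classification-and-extraction prompt along these lines would fit in prompts.txt; the wording and the `{{message}}` variable below are illustrative stand-ins, not the original prompt:

```
Classify the following customer support message and extract the key details.

Message: {{message}}

Respond with JSON containing the fields "category", "sentiment", and "summary".
```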

Nix Language

To broaden my knowledge of nix, I’m working through an Overview of the Nix Language. Most of the data types and structures are relatively self-explanatory in the context of modern programming languages. Double single quotes strip leading spaces: `'' s '' == "s "`. Functions are a bit unexpected visually, but simple enough with an accompanying explanation. For example, the following is a named function f with two arguments x and y.

Zero to Nix

I started working through the Zero to Nix guide. This is a light introduction that touches on a few of the command line tools that come with nix and how they can be used to build local and remote projects and enter developer environments. While many of the examples are high-level concepts you’d probably apply when developing with nix, flake templates are one thing I could imagine returning to often.
I’ve been following the “AI engineering framework” marvin for several months now. In addition to openai_function_call, it’s currently one of my favorite abstractions built on top of a language model. The docs are quite good, but as a quick demo, I’ve ported over a simplified version of an example from an earlier post, this time using marvin.

```python
import json

import marvin
from marvin import ai_model
from pydantic import (
    BaseModel,
)
from typing import (
    List,
)

marvin.
```
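As I understand marvin’s `ai_model` decorator, it wraps a Pydantic model so that constructing the model from free text asks the language model to fill in the structured fields. A minimal sketch in that style, with made-up field names and sample text rather than the original post’s example:

```python
import json
from typing import List

from marvin import ai_model
from pydantic import BaseModel


@ai_model
class SupportMessage(BaseModel):
    # Hypothetical fields; the original example's model isn't shown here.
    summary: str
    product_areas: List[str]
    urgent: bool


# Constructing the decorated model from free text has the language model
# populate the structured fields.
message = SupportMessage(
    "The dashboard has been down since this morning and we launch tomorrow!"
)
print(json.dumps(message.dict(), indent=2))
```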