With ACME draft 15, Let’s Encrypt introduced POST-as-GET requests. It is a breaking change that is not backward compatible with previous drafts.
This put me in an uncomfortable position. While the Pebble server enforces the use of POST-as-GET, other servers, like the Let’s Encrypt server, don’t support it yet. For this reason, acme4j needs to support both the pre-draft-15 GET requests and the new draft-15 POST-as-GET requests. Luckily, I have found a solution that is totally transparent to the user, at least as long as no other ACME server is used.
This is how acme4j v2.4 works:
- If you connect to Boulder via an `acme://letsencrypt.org` URI, acme4j falls back to a compatibility mode that still sends GET requests. Let’s Encrypt has announced a sunset date for GET requests on November 1st, 2019. You are safe to use acme4j v2.4 (and older versions) until that date.
- If you connect to a Pebble server via an `acme://pebble` URI, the new POST-as-GET requests are used.
- If you connect to a different server implementation via an `http:` or `https:` URI, acme4j now sends POST-as-GET requests. This is likely going to fail at runtime if the server you connect to does not support draft-15 yet.
- As a temporary workaround, you can add a `postasget=false` parameter to the server URI (e.g. `https://localhost:14000/dir?postasget=false`) to make acme4j enter the fallback mode and send GET requests again.
As soon as Let’s Encrypt supports POST-as-GET on their production servers, I will remove the fallback mode from acme4j. It just clutters the code, and I will have no proper way to test it after that.
Hint: Before updating acme4j, always have a look at the migration guide. It will show you where you can expect compatibility issues.
Just in time for Halloween 🎃, I made a ghost decoration that uses an Adafruit Circuit Playground Express.
The ceramic ghost is from a home decoration shop. I put a small piece of greaseproof paper inside, so the LED light can shine through the ghost’s mouth and eyes.
The CircuitPython source shows a candlelight effect. For the flame, a mystic cyan color is used, so the ghost appears really spooky. 👻
If you copy .wav files to the Circuit Playground, a random sound effect is played from time to time. I found nice free sound effects on soundbible.com that surely give everyone the chills. The sound files should be converted to mono and a 16 kHz sampling rate so they fit into the tiny Playground Express memory. The sound effects can be muted using the switch on the Playground if they become too annoying. 😉
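The conversion can be scripted; below is a minimal sketch using only the Python standard library. The file names are just examples, and the crude sample-picking resampler has no low-pass filter, so it is only good enough for short sound effects (an audio tool like Audacity does a cleaner job):

```python
import struct
import wave

def convert_to_mono_16k(src_path, dst_path, target_rate=16000):
    # Downmix a 16-bit PCM WAV file to mono and crudely resample it
    # to 16 kHz by sample picking (no low-pass filtering).
    with wave.open(src_path, "rb") as src:
        channels = src.getnchannels()
        rate = src.getframerate()
        if src.getsampwidth() != 2:
            raise ValueError("16-bit samples expected")
        frames = src.readframes(src.getnframes())
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    # Downmix: average the channels of each frame
    mono = [sum(samples[i:i + channels]) // channels
            for i in range(0, len(samples), channels)]
    # Pick roughly every (rate / target_rate)-th sample
    resampled = [mono[i * rate // target_rate]
                 for i in range(len(mono) * target_rate // rate)]
    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(2)
        dst.setframerate(target_rate)
        dst.writeframes(struct.pack("<%dh" % len(resampled), *resampled))
```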
This is the code.py file that is to be copied to the Circuit Playground. The candle simulation was inspired by an example from Adafruit. I made the candle color configurable and added the sound effects.
import math
import os
import random

from adafruit_circuitplayground.express import cpx

# Candle color and brightness range of the flicker effect
CANDLE_COLOR = (0x00, 0xC0, 0xFF)
MIN_BRIGHTNESS = 64
MAX_BRIGHTNESS = 191

# Color that is shown while a sound effect is playing
EFFECT_COLOR = (0xFF, 0x00, 0x00)
STEPS_BEFORE_EFFECT = 100


def split(first, second, offset):
    # Midpoint displacement: recursively generates a smooth but
    # flickering transition from brightness "first" to "second"
    if offset != 0:
        mid = (first + second + 1) / 2 + random.randint(-offset, offset)
        offset = int(offset / 2)
        split(first, mid, offset)
        split(mid, second, offset)
    else:
        # Apply gamma correction, then scale the candle color accordingly
        level = math.pow(first / 255.0, 2.7) * 255.0 + 0.5
        cpx.pixels.fill((
            min(255, int(level * CANDLE_COLOR[0] / 256)),
            min(255, int(level * CANDLE_COLOR[1] / 256)),
            min(255, int(level * CANDLE_COLOR[2] / 256))
        ))
        cpx.pixels.show()


def colorShift(start, end, steps):
    # Fade from the start color to the end color in the given number of steps
    for i in range(1, steps):
        j = steps - i
        cpx.pixels.fill((
            min(255, int((start[0] * j + end[0] * i) / steps)),
            min(255, int((start[1] * j + end[1] * i) / steps)),
            min(255, int((start[2] * j + end[2] * i) / steps))
        ))
        cpx.pixels.show()


cpx.pixels.brightness = 0.4
cpx.pixels.auto_write = False
cpx.pixels.fill((0, 0, 0))
cpx.pixels.show()

# All .wav files found on the CIRCUITPY drive are used as sound effects
sounds = [f for f in os.listdir() if f.endswith('.wav')]

count = 0
prev = (MAX_BRIGHTNESS + MIN_BRIGHTNESS) / 2
while True:
    lvl = random.randint(MIN_BRIGHTNESS, MAX_BRIGHTNESS)
    split(prev, lvl, 32)
    prev = lvl
    count += 1
    if count > STEPS_BEFORE_EFFECT:
        count = 0
        # Play a sound only if the switch is on and sound files were found
        if cpx.switch and sounds:
            oldcolor = cpx.pixels[0]
            cpx.pixels.fill(EFFECT_COLOR)
            cpx.pixels.show()
            cpx.play_file(random.choice(sounds))
            colorShift(EFFECT_COLOR, oldcolor, 10)
The season of long winter nights is coming, so I got myself an Adafruit Circuit Playground Express for some home decoration.
My plan is to program it using CircuitPython, a MicroPython derivative that is adapted to the Adafruit hardware.
CircuitPython must be installed on the Circuit Playground first, which turned out to be difficult on Fedora Linux at the first attempt. The troublemaker was the ModemManager, which is still installed by default. It detects the serial port of the Adafruit device and then hogs this resource because, well, it might be a modem. 🙄
My older readers certainly still remember what a modem is. 😉 To make a long story short, almost no one uses a serial modem nowadays, so the ModemManager does not serve any useful purpose. However, it cannot be removed, because other important packages depend on it. The only way is to stop it permanently:
sudo systemctl stop ModemManager
sudo systemctl disable ModemManager
After that, I could finally install CircuitPython on the Circuit Playground. First I downloaded the matching uf2 file. Then I connected the Playground via USB and switched it to bootloader mode by pressing its reset button twice (like a double-click).
The Circuit Playground is now mounted as a USB drive called CPLAYBOOT. All there is to do now is to copy the uf2 file to this drive. The Playground automatically reboots after that. If everything went fine, it will come back as a USB drive called CIRCUITPY.
The next step is to install the AdaFruit libraries to that drive. I downloaded the latest library bundle that matched my CircuitPython version, and just unpacked it to the CIRCUITPY drive.
That’s it… All that is left to do is to open the code.py file in an editor and change the code. It is executed immediately when it is saved.
For a start, there are a lot of code examples for the Circuit Playground Express on the Adafruit website.
A simple method to keep burglars away from your home is a TV simulator. It’s basically a device with some bright RGB LEDs showing color patterns that resemble those of a telly turned on. They are sold by many electronics retailers. However, some customer reviews say that the color patterns do not look very realistic, with some distinctive flashes and colors that you would usually not see in a regular movie. Besides that, the color patterns often repeat after about half an hour.
Distinctive color patterns that repeat after a short time are a major disadvantage. Experienced burglars might recognize them and figure out it’s a TV simulator. That would be an invitation rather than a deterrent.
So, let’s build a better TV simulator ourselves. It’s also more fun than buying something ready.
The Idea
What do we actually see when we watch a telly from outside a room? Usually there are colors with a medium saturation, changing slowly while the actors move or the camera pans. And there are very few hard cuts in the hue or brightness when the scene changes, when there is an explosion in an action movie, or something like that.
We could simulate these effects randomly, but what would look more realistic than using a real movie as the source? The idea is to squeeze a real movie stream into a sequence of RGB colors that is small enough to fit into an Arduino Uno with just 32 KByte of flash space.
You say that’s impossible? It isn’t! 😁
The Ingredients
I used the following parts for the TV simulator. They are rather cheap and can be purchased in many electronic shops:
- Arduino Uno
- Velleman KA01 RGB shield. Any other RGB shield or a self-made solution should work as well, maybe with some minor modifications to the code.
- RGB LED strip (12 V, about 7 W, Common Anode)
- 12 V wall power supply for supplying the Arduino and the RGB LED strip (1 A or more)
Now just connect the RGB shield to the Arduino, and connect the LED strip to the RGB shield. That’s all, hardware-wise.
Caution: The RGB LED strip is a very intense light source that can cause permanent damage to your eyes! Avoid looking directly into the LEDs!
The Software
You can find the source code of the Arduino sketch and the movie converter in my GitHub repository.
The movie converter is written in Python and uses OpenCV. Basically, it converts an MPEG stream into source code that can be included in the Arduino sketch. There are also a few parameters you can toy around with to improve the result.
The next step is to compile the sketch and upload it to the Arduino. The movie should start playing back immediately.
Actually, the encoding is so efficient that you can cram a full two-hour movie into the tiny Arduino memory. (Two hours at 10 frames per second is 72,000 frames; stored as raw RGB colors, that alone would be more than 200 KByte, so the encoding has to shrink the data considerably.)
Behind the Curtains
How does the converter work? And why is it so efficient?
First of all, as nobody is going to actually watch the simulated movie, we can remove the audio tracks and drastically reduce the frame rate. 10 frames per second is the default, but if the Arduino memory is too small for your movie, you can even go down to 5 frames per second.
Also, unlike a real TV, the RGB shield can only generate a single color, so all we need is a movie stream that is scaled down to a single pixel.
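Scaling a frame down to one pixel is just a per-channel average of all its pixels. The converter does this via OpenCV; the sketch below only illustrates the reduction in plain Python and assumes a frame given as rows of (r, g, b) tuples:

```python
def average_color(frame):
    # Reduce one video frame to a single (r, g, b) color by averaging
    # each channel over all pixels. "frame" is a list of rows, each row
    # a list of (r, g, b) tuples.
    total = [0, 0, 0]
    count = 0
    for row in frame:
        for r, g, b in row:
            total[0] += r
            total[1] += g
            total[2] += b
            count += 1
    return tuple(t // count for t in total)
```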
The converter software now analyzes the single-pixel movie stream. For soft color changes (like fixed camera positions or slow camera pans), it only stores the target color and the time it takes to reach it. When playing back, the Arduino gradually changes from the current color to the target color in the given time. For hard color changes (like jump cuts or action scenes), the change to the target color happens immediately.
Movies mostly have soft color changes, and only a few hard changes, so several seconds of video material can be stuffed into just a few bytes. This is why the converter is so efficient.
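The principle can be sketched in a few lines of Python. This is my own illustration of the idea, not the actual converter from the repository, and the thresholds are made-up values:

```python
def color_distance(a, b):
    # Manhattan distance between two (r, g, b) colors
    return sum(abs(x - y) for x, y in zip(a, b))

def encode(colors, frame_ms=100, drift=16, cut=96):
    # Collapse a single-pixel color stream into (color, fade_ms) events.
    # fade_ms == 0 means a hard cut; otherwise the player fades to the
    # color over fade_ms milliseconds. "drift" and "cut" are arbitrary
    # example thresholds.
    events = [(colors[0], 0)]  # jump to the first color immediately
    last, pending = colors[0], 0
    for prev, cur in zip(colors, colors[1:]):
        pending += frame_ms
        if color_distance(prev, cur) >= cut:
            events.append((cur, 0))        # hard cut: change at once
            last, pending = cur, 0
        elif color_distance(last, cur) >= drift:
            events.append((cur, pending))  # soft change: fade over time
            last, pending = cur, 0
    return events

def playback(events, frame_ms=100):
    # Reconstruct the color stream; soft events interpolate linearly,
    # just like the Arduino does at playback time
    out, current = [], (0, 0, 0)
    for target, fade_ms in events:
        steps = fade_ms // frame_ms
        if steps == 0:
            out.append(target)  # hard cut
        for i in range(1, steps + 1):
            out.append(tuple(current[c] + (target[c] - current[c]) * i // steps
                             for c in range(3)))
        current = target
    return out
```

Long stretches of slowly changing colors collapse into a single three-byte color plus a duration, which is why several seconds of video fit into just a few bytes.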
The algorithm is pretty simple, but gives amazingly good results, especially on action movies with a lot of cuts and striking action scenes.
The luftdaten.info project offers a build guide for a fine dust sensor. It’s cheap and easy to put together even with little electronics knowledge. You can get the parts in various electronics shops and nowadays even on Amazon.
The original kit uses two plastic pipes as a casing. They’re cheap and easy to get in any DIY store, but they don’t look particularly great. I chose a standard UV and weatherproof outdoor junction box as the casing instead. A custom-designed 3D-printed frame is placed inside, and the electronics are mounted on it.
The frame already has a wind tunnel for the drawn-in air, so unlike the original instructions, you don’t need a hose. Grids in front of the air vents stop insects from crawling into the case. Also, unlike some other printed solutions, the fine dust sensor is aligned as specified by the manufacturer, and the intake hole is protected from light.
Unlike the original guide, though, you can’t avoid picking up a soldering iron here.
You’ll need the following parts:
- 1x set of 3D-printed frame parts
- 1x OBO Bettermann T60 junction box with plug-in seals
- 1x NodeMCU ESP8266 (by Lolin, other brands might not fit)
- 1x ILS-Nova SDS011 fine dust sensor
- 1x DHT22 temperature and humidity sensor (usually optional, but required here as it seals off a hole in the wind tunnel)
- 1x BMP180 temperature and air pressure sensor (optional)
- 11x wood screws 3.0 x 12 mm
- 4x wood screws 3.5 x 12 mm (or four more 3.0 x 12 mm)
- A bit of ribbon cable
- Some heat shrink tubing
- USB cable (flat)
- USB power supply (an old mobile phone charger is absolutely fine, the sensor needs less than 200 mA)
You don’t need any particularly UV or weatherproof filament for the printed parts, since they’re protected by the junction box. The filament just shouldn’t be so brittle that the screws break it. And it should be as dark as possible so no stray light gets into the dust sensor’s opening. I used basic black PLA.
Important: The print should be done without supports, as they are really hard to remove afterwards and could block the air duct. The parts are designed so they can be printed with PLA even without supports.
First off, remove the two plug-in seals from one side of the junction box and cut them open to a diameter of 19-20 mm.
Now poke the USB cable through the left hole, then pop the lower support frame in from the inside and screw it down with the 3.5 x 12 mm screws. The casing can now be sealed up again with the plug-in seals.
It’s time to solder the electronics together according to the guide. The pins of the DHT22 should be protected with the heat shrink tubing. How to wire up the optional BMP180 air pressure sensor is explained in the FAQ. It’s best to install the firmware now and do a test run. Once the device is put together, it might be trickier.
Screw the NodeMCU and the BMP180 onto the base plate and plug the USB cable in. Just slot the DHT22 into its designated spot in the wind tunnel (see photo), it doesn’t need to be screwed down.
Next, pop the top plate on and screw it down. Connect the fine dust sensor, then carefully slide it into the air vent and screw it in place. You might need to snip a tiny bit off the plug to make it fit.
That’s it. You can pop the junction box lid on and secure it. The sensor is ready to go.
Mount the fully assembled sensor on an outside wall with the openings facing downwards. The casing should easily survive rain, snow, and hail. Since PLA softens at 60 °C, you should avoid a spot in direct sunlight, though. The sensor can also be operated lying flat, but the openings should then be protected a bit from the rain.
The sensor is powered via the USB cable. Since no data is transmitted, the cable can easily be several metres long.
Once you’ve registered your sensor at luftdaten.info, you can see the measurement data on the map and download it as a CSV file. The current measurement data can also be queried directly via Wi-Fi in JSON format and stored in a database, for example. I use a little custom-programmed tool that fetches the data regularly, drops it into a PostgreSQL database, and displays it via Grafana.
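Such a fetcher is simple to write thanks to the JSON interface. Here is a minimal sketch in Python; the sensor address is made up (use your sensor's IP), and the exact value_type names in the response depend on your firmware version, so treat them as assumptions:

```python
import json
from urllib.request import urlopen

SENSOR_URL = "http://192.168.1.50/data.json"  # made-up address, use your sensor's IP

def parse_readings(data):
    # Flatten {"sensordatavalues": [{"value_type": ..., "value": ...}, ...]}
    # into a simple dict such as {"SDS_P1": 12.3, "SDS_P2": 6.7}
    return {v["value_type"]: float(v["value"])
            for v in data["sensordatavalues"]}

def read_sensor(url=SENSOR_URL):
    # Fetch and parse the current readings from the sensor's web interface
    with urlopen(url, timeout=10) as resp:
        return parse_readings(json.load(resp))
```

Run regularly (e.g. from cron), the returned dict can then be written into a database of your choice.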
Have fun tinkering! 😀







