I Refuse to Admit Failure… YET.

I finally picked up one of those 8x8x8 LED cube matrix kits. I'm a sucker for blinkyshit, all the DC540 regulars know that. I'm doing the rare thing here of documenting before all of the issues are resolved, just because I think the process deserves documentation.

I am by no means a hardware expert. I stand on the shoulders of the entire internet when it comes to mucking about with programming microcontrollers. I’ve gotten better, but it’s still not innate to me the way other aspects of technology are. There are just too many microcontrollers, and too many ways of poking at them. I2C, SPI, JTAG, sometimes it seems almost overwhelming.

But here we are, with this STC12C5A60S2 microcontroller, already installed on the PCB. I went through all the steps over the weekend of soldering all 512 LEDs and the other chips and small parts. I don't know about you, but when I get close to the end of a project like this, the anticipation starts to really kick in. If I'm not careful, it's easy to get sloppy and make a stupid mistake. But I didn't, this time. I did find myself short on LEDs. The kit came with extras of most of the small parts, but inexplicably, only the exact number of LEDs, and two of them were DOA. So I had to order replacements from another supplier, and I didn't think to order long-leg LEDs for the replacements, so I really had to work a bit to fit them in.

So here we are, it's all assembled, looks great from a distance, but up close you can see my sloppy skills. This is how Captcha protections should work: they should evaluate us on our assembly skills. Clearly I am not a robot.

From the instructions I found, the STC12 is supposed to be pre-programmed, and I should just be able to apply power and see the animations. No such luck. It illuminates a block of LEDs, but no animation. To be thorough, I double-checked all the chip orientations, and double-checked all LED paths by using my bench power supply and applying 3V to each power vertical and grounding each ground horizontal to confirm that every LED is “addressable.” I suspect from Internet research that they lapsed and sent me an un-programmed STC12, because it’s documented that this happens. Not a problem, I’m up for the challenge, I’ll figure this out.

Let's see. It wants a USB-to-TTL UART serial device. Four-pin header: VCC, GND, P30 (RX) and P31 (TX). Well, I don't have the Adafruit programmer they recommend, but I do have an FTDI FT232R. Let's give that a shot… Nope, it doesn't seem to recognize the power cycle; it stays on "Waiting for MCU…" even though I cycled power. NOTE: during this process, the device is powered, at 5V, by the USB programmer. Interestingly, and the Internet backs me up on this, the power light remains dimly lit even with the power button off. Several sources report that parasitic power leaking from the TX line can interfere with the power-cycle reset process, preventing this from working. It's possible this is only an issue with these FTDI programmers, and maybe the problem will go away when I use the recommended Adafruit programmer, which arrives today.

But I'm impatient, I WANT IT NOW! So I started scouring the lab to see if I have any other options available to me. Hmm, I have a Bus Pirate, the Swiss army knife of microcontroller programmers. I spent about an hour last night learning it and futzing with it. The Bus Pirate is interesting but cumbersome. You plug it in, then you connect directly to it over serial (I use screen on the MacBook) and configure it for the purpose intended using a menu system. Then I exit screen and do what I would normally do with a dedicated programmer.

The Bus Pirate doesn't seem to handle the power situation correctly either, but in a different way. It doesn't seem to know how to power cycle correctly in UART mode. Even if I set power on before running the stcgal command, it shuts power off when I initiate the sequence and never turns it back on again. What if I disconnect power and ground from the programmer to the board and use the cube's external power supply? I'll try that after this post, but I don't have a lot of hope. I tried this tactic with the FTDI and didn't see any difference. I wonder if part of the process is the programmer detecting voltage via the same pins it provides voltage on. UPDATE: Tried that on the Bus Pirate, no luck. Also tried another suggestion, putting a 10K resistor inline with TX to keep that parasitic power at bay. No luck. Hopefully the Adafruit programmer will work.
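For the record, the stcgal invocation I've been fighting with looks roughly like this; the port name and firmware filename are placeholders for my setup, not gospel:

```shell
# stcgal is a Python tool: pip3 install stcgal
# The port and hex file below are placeholders -- yours will differ.
stcgal -P stc12 -p /dev/tty.usbserial cube.hex
# stcgal then waits for you to power-cycle the MCU; this is the step
# where the parasitic power leaking from the TX line seems to keep
# the reset from ever being detected.
```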

Another option is that I have one of those ZIF-socket chip programmers. That’ll be a last resort. I prefer not to pull chips off the board, even though they’re socketed, because of the potential for excessive bending and possible breakage of the pins.

Oh well, one way or another I’ll update this already-too-long shitpost later today. I’ve got at least two paths left to explore today.

Is Gitlab a viable Atlassian alternative? Spoiler: maybe?

Maybe you’re one of those stubborn people like me who insists on self-hosting everything. Maybe it’s a requirement due to sensitivity of data, or maybe it’s just pride. In any case, that’s what I was doing. I was proud of my Atlassian setup. I happily paid my $10 each for 10-user licenses of various Atlassian products. Jira, Confluence, Bitbucket.

Everything was fine, and everyone was living happily ever after.

UNTIL.

And this is where I sacrifice my personality for professionalism. In my humble opinion, Atlassian made a huge error in judgement. They decided to end support for their “Server” line of products in favor of “Cloud” and “Data Center.” No more $10 10-user licenses for self-hosted apps. 10-user licenses are FREE now — in the cloud. You want to host it yourself? Fuck you, go get the Data Center version. How much is it? Well, if you have to ask…

And yes, I was holding back. I’m a little bitter.

So here I am, exploring ways I can take my business elsewhere. I'm a simple man with simple needs. I don't need all the workflow bells and whistles that Jira offers. Hell, we don't even use most of that at my job. At the core, I need projects and issues. Gitlab has that. And of course Gitlab can do everything that Bitbucket does. What's left? Hmm, Confluence. Well, I'll explore that part later. I do know that there's a "Markdown Exporter" plugin for Confluence that will export "markdown" documents in a way that can be imported into Gitlab, Github and other apps. I just don't know what the equivalent paradigm is yet.

So let’s start with eradicating Bitbucket.

OK, I built a VM. CentOS 8. Gitlab’s installation instructions are crystal clear. A few prerequisites, an update, and a repo install, then a package installer. Nice, that’s how I like it. OK, they include a LetsEncrypt cert deployment by default. We’ll have to get rid of that, I have my own CA internally, and I issue certs from that. Done, not so hard. Next, SSO. I have FreeIPA in my infrastructure and had integrated the Atlassian products with that. Can I do that with Gitlab? Shit yeah. Easy as chocolate pie. A little bit of finagling with the .rb file and I’m in.
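For anyone curious, the gitlab.rb finagling amounted to a handful of settings; the hostnames, paths and DNs below are placeholders for my environment:

```ruby
# /etc/gitlab/gitlab.rb -- relevant bits only; values are placeholders

external_url 'https://gitlab.example.lan'

# Use my internal CA's cert instead of Let's Encrypt
letsencrypt['enable'] = false
nginx['ssl_certificate']     = '/etc/gitlab/ssl/gitlab.example.lan.crt'
nginx['ssl_certificate_key'] = '/etc/gitlab/ssl/gitlab.example.lan.key'

# SSO against FreeIPA via LDAP
gitlab_rails['ldap_enabled'] = true
gitlab_rails['ldap_servers'] = {
  'main' => {
    'label'      => 'FreeIPA',
    'host'       => 'ipa.example.lan',
    'port'       => 636,
    'encryption' => 'simple_tls',
    'uid'        => 'uid',
    'base'       => 'cn=users,cn=accounts,dc=example,dc=lan',
    'bind_dn'    => 'uid=svc-gitlab,cn=users,cn=accounts,dc=example,dc=lan',
    'password'   => 'changeme'
  }
}
```

A `gitlab-ctl reconfigure` afterward applies it.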

So now on to Bitbucket. Well, they just went and built in the integration/import functionality, just like that. I can give it my bitbucket login and password and import ALL of my bitbucket projects in one session. Lovely. I’m in tears over here. Literally ten minutes after getting Gitlab up and running in my environment, I’ve got all my git repos imported.

How about Jira? Well, it used to be a pain in the ass; when I first looked into it, it sounded intimidating. "Well, you'll need to do REST API queries to both services to translate everything blah blah blah". Nope. Not anymore. The latest Gitlab has an importer built in. It's a little weird and roundabout, but it farging works. Go to, or create, a project. Go to the Issues page within that project. Click the "Import from Jira" button. Here's where it gets weird: you have to re-enter the Jira integration details for each project before you can import that project's issues. It would be nice if you could do it once, map the Jira projects to existing projects, choose to ignore or create the rest, and click go. But no problem. It brings them in, correctly lists some of them as closed, etc. It's just going to take some time, thought and planning.

Confluence integration is going to require its own post, because getting all the confluence data over, including attached files, is going to be important to me. I use it as a home for a whole lot of documentation that I refer to frequently, and I can’t afford to lose it. So stay tuned for more on that.

I’d love to hear what other people are doing. I can’t be the only one dealing with the loss of the nearly-free Server products.

Adding a Wyze Cam-Pan to Octopi

I’ve been using Octopi with my 3D printers for almost as long as I’ve been printing. The whole concept of printing from SD cards just seems alien to me, when Octopi/Octoprint jumps through all the hoops for you. I mean ALL the hoops. Upload your gcode to a web interface, set the print, watch the print, manage temperatures… there is even a spaghetti detection plugin!

One of the biggest benefits is camera integration. Why? To monitor progress, to create stunning time-lapses. The technology has advanced so much that the Octolapse plug-in can detect when the Z-layer changes, move the extruder to an out-of-the-way corner, and take a snapshot of the current state of your print, then continue printing as if nothing happened. This results in beautiful yet creepy time-lapses where the object simply appears to grow out of thin air.

The typical thing to do is integrate directly with the Pi camera. It is perfectly utilitarian, and does the job. I haven’t been super happy with the Pi camera, however. Until recently, I had cobbled together systems to hold a camera in place using helping hands, or whatever other makeshift device I had on hand. Then I 3D-printed a mount to do the same thing. But the quality’s just not there. The resolution is inferior, it doesn’t handle low-light well, and it can’t pan, tilt or zoom, so you’re stuck with manual adjustments.

Then I saw that someone had created a frame mount for a Wyze Cam-Pan for the Ender 5. The Cam-Pan can be found for under $30, and has full HD and PTZ capabilities. Also records and speaks audio, not that I’d need that here. So I printed one, and ordered one, before researching how to integrate it.

Well, by default, the Cam-Pan wants to work like a Ring camera and send its output to the cloud. SAAS is king, apparently. But wait. Wyze offers RTSP firmware for it. That makes it simple, right? Well, not so fast. It makes a decent stream, but it doesn’t seem that Wyze’s RTSP firmware offers the still-image function which is required by the Octolapse plug-in.

Another option Wyze offers is USB Webcam firmware. But that requires a clunky additional wired connection, a USB-A to USB-A cable, from the Pi to the Wyze camera. HATE IT.

Started talking with Kevin about reverse-engineering the Wyze firmware to see if there was hidden functionality, but then I remembered that everything has already been done. So I googled Wyze camera reverse-engineering. First I found some very confusing custom firmware made for the Wyze V1 and V2 cameras. This was getting closer, but I'm specifically looking for the Cam-Pan version. I started reading the "issues" section of the GitHub repo for that release, which hadn't been updated in three years. People there were wondering why it exists at all, since at its core it's a no-further-benefit fork of the Xiaomi Dafang firmware for Wyze, which is better documented, more thorough, specifically known to support the Cam-Pan, and was updated just four months ago.

Fast forward a couple hours, and I did it:

  • I flashed the custom bootloader. It's smart: if it detects the SD card with the custom software on it, it runs that. Otherwise it runs the built-in Wyze version. BRILLIANT.
  • I created the custom SD card, editing wpa_supplicant.conf to connect to my wifi.
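For reference, the wpa_supplicant.conf on the card only needs the basics; the SSID and passphrase here are obviously placeholders:

```
# wpa_supplicant.conf on the SD card -- SSID/PSK are placeholders
ctrl_interface=/var/run/wpa_supplicant
update_config=1

network={
    ssid="my-ssid"
    psk="my-passphrase"
    key_mgmt=WPA-PSK
}
```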

I booted it up, and was dismayed. It clearly "works," in that ssh, web and rtsp ports are open by default, but this was clearly written before the great clampdown on TLS. A self-signed cert from an untrusted root, whose common name will never match the hostname. I spent at least an hour going down that rabbit hole and trying to bypass it, but here's the thing: not only does it have to work in my browser, it also has to work from the Octoprint installation on the Pi. Since I now have SSH access to the Wyze camera via its new firmware, I logged in to see if it would be easier to just "replace the certs."

Sure enough, it all hinges on just a cacert and a lighttpd cert. I figured I had little to lose at this point, so I generated a new cert for it, signed by my infrastructure (what, doesn’t everyone run FreeIPA in their basement in 2021?), and dropped these new and authenticated certs into place. I power-cycled the camera. IT CAME BACK UP. And now, at least it ALLOWS me to bypass SSL/TLS errors (apparently my FreeIPA server isn’t too smart about daylight savings time, so I still have twenty minutes before that cert is valid). (This was incorrect, as I discovered today. My infrastructure was using an ntp server that had been removed from my network, and has been WRONG for some time now. I fixed that today!)
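If you want to do the same, the key and CSR generation is the easy part. This is just a sketch; the hostname and filenames are assumptions, so match whatever the camera's lighttpd config actually points at:

```shell
# Generate a fresh key and CSR for the camera; the CN is a placeholder hostname
openssl req -new -newkey rsa:2048 -nodes \
  -keyout wyzecam.key -out wyzecam.csr \
  -subj "/CN=wyzecam.example.lan"

# Sanity-check the CSR before handing it to the internal CA for signing
openssl req -in wyzecam.csr -noout -verify
```

Then sign the CSR with your internal CA (for FreeIPA that's an `ipa cert-request`), drop the resulting cert and key over the originals on the SD card, and power-cycle.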

The point of the whole TLS exercise is so that I can add the camera to Octolapse, which actually does a verification check to ensure it can read both the stream and the "capture current pic" snapshot link. I suppose I also could have figured out how to turn off https entirely in lighttpd, but I was already nearly done when I thought of that option. Also, it's not a full Linux deployment on that firmware. It's a stripped-down busybox/MIPS Linux with all the configs living on the SD card.

I ended up disabling SSL support anyway, because why not?

The next problem — it looks like Octoprint/Octolapse, between the two of them, need the feed in lots of different ways. The main "control" screen in Octoprint needs it as http/mjpeg — no other feed type will work here and show the current moving image. There are -three- settings within Octolapse — base address, stream and snapshot. And I'm pretty sure the stream option here would take an rtsp stream. The snapshot, however, is ghoulish, in that it needs an actual snapshot function. Why not just build in a function that takes a current snapshot from the stream? Oh well, I don't know the capabilities that well; there must have been a good reason for that choice.

Meanwhile, the camera won't do http mjpeg with any of the firmware I've tested so far. The workaround for this seems to circle back to my original thinking: use ffmpeg and ffserver on the Pi itself to suck in the rtsp stream and serve it on a different port locally as mjpeg. Again, it seems like an awful lot of load to do something so seemingly simple. But packet-wise, everything stays local; it would be sucking that feed into Octopi anyway. BTW, ffserver requires a specific older version of ffmpeg and is a 2-4 hour compile on the Pi.
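For anyone trying the same thing, the restream setup looks roughly like this: an ffserver.conf on the Pi, plus an ffmpeg process pulling the camera's RTSP feed into it. The port, frame rate and size here are assumptions from my setup:

```
# ffserver.conf -- ffserver only exists in older ffmpeg releases (3.4 and earlier)
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 10

<Feed camera.ffm>
  File /tmp/camera.ffm
  FileMaxSize 5M
</Feed>

<Stream camera.mjpg>
  Feed camera.ffm
  Format mpjpeg
  VideoFrameRate 10
  VideoSize 640x480
  NoAudio
</Stream>
```

With ffserver running, something like `ffmpeg -rtsp_transport tcp -i rtsp://<camera-ip>:8554/unicast http://127.0.0.1:8090/camera.ffm` feeds it (the /unicast path is the Dafang firmware's default), and Octoprint can then point at http://127.0.0.1:8090/camera.mjpg.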

After managing to get it "working" with the mjpg custom firmware, I wasn't happy with the result. The network overhead of the Pi streaming the video from the camera and then re-streaming it internally was too much, and the image ended up being glitchy and problematic. I also think having ffmpeg and ffserver running was adding a notable load to the Pi.

So I broke down and bought the USB-A to USB-A cable and flashed the Wyze camera with the Wyze USB-Cam version of the software. Then I ran raspi-config to turn off Pi Camera support. After setting Octoprint back to the Webcam profile and rebooting it, it just picked it up naturally. I did install the uvc software so that I could tweak the settings, but I'm not 100% sure that was necessary for the camera to work. In any case, I have great video quality and can now tweak some of the exposure and white balance settings on the camera for more dramatic timelapses.

I guess that’s a wrap. Check out the DC540 youtube for future timelapse videos.

3D-printed lithophane “art”

When I first saw examples of 3D-printed lithophanes, I thought it was a great development of the technology for uses that may not have been originally intended. Lithophanes have a long history, predating all of our high-tech methods.

https://itslitho.com/itslitho-blog/one-of-the-most-unusual-artworks-from-the-early-19th-century-the-history-of-lithophane/

But some of the articles I read at first said that doing them with PLA via FDM was impractical, and you’d really need SLA for the increased accuracy and complexity available. But then I recently started finding articles that showed examples with quality white PLA, so I tried it. Sure enough, it’s fantastic.

And it’s ridiculously easy to get quality results. Just upload your image to https://3dp.rocks/lithophane/ and it will make an STL for you. Here’s the tutorial I used:

https://sovol3d.com/blogs/news/tutorial-how-to-print-lithophane-on-your-3d-printer

Lithophane as printed , still on print bed.
Same lithophane held up to a light source.

Latest utility print: Phone mount for Jeep JKU

I've tried vent mounts, seat bolt shaft mounts, and others. Everything sucked. But some kind soul designed a phone mount specifically for the area of the dash above the radio. It clips into the dashboard tray and the gap just above the radio, without disrupting buttons if you aim it just right. And it has a cutout for a wireless charger; specifically, it seems to be designed for the Vinsic extra-slim wireless charger. Got one on the way to try out.

One comment says that a hot day caused enough meltage for his to fall and warp. He was going to try it in another material (PETG) to see if that helped. I haven’t printed in anything besides PLA yet, but it’s about time for my experimentation to move to the next level, so I’ll keep an eye on that.

It printed in three parts — the main front panel and the two side mounts. Fitting the mounts to the front piece was a VERY tight fit, I actually had to shave the mounts a bit with a razor, and still had to tap them in. Nice solid piece overall, and fits my Pixel 3 snugly.

Source: https://www.thingiverse.com/thing:2769593

Cornstarch solves more household problems

Ever since adding the two LACK shelves to my music lab, using the computer in there has been annoying. Actually, it was annoying before. The table is too low to use it while standing, so I either had to bend a little bit or sit. And since adding the monitor, sitting is a non-starter, because I’m staring up into the abyss.

I tried placing the keyboard up on the shelf, but the shelf is fairly narrow, and the monitor base got in the way. So off to Thingiverse I go, yet again, in search of Apple keyboard risers. I found this one, which looked promising. Initially I was looking for something I could adapt to have a slight overhang at the front end, so that it would be anchored to the shelf. But this one, bless the designer’s heart, has three insets for 5mm rubber beading. Two on the bottom, to keep it from sliding on a surface, and one on the top, to help keep the keyboard itself stable, although the channel for the rear undercarriage should keep it well in place.

Now the keyboard is right in front of the monitor, perfect for the occasional standing Google search or setting it up to monitor Octoprint. Rubber beading is on the way, but it’s reasonably stable without it!

Source: Thingiverse: https://www.thingiverse.com/thing:2811956

Filling my life with specialty cornstarch…

Yesterday I decided I was tired of using random nearby objects as a riser to elevate my airpod case to the level of the wireless charger.

I started looking on Thingiverse for random airpod cases to see if one could be adapted for this purpose. Lo and behold, I found one. This one was offered as a Tesla-branded holder, but I chose the unbranded alternative that was also offered.

It sits nicely in the cradle, and elevates the case to just the right level to receive a charge.

Did you know that PLA is a thermoplastic derived from cornstarch? This was a recent revelation for me. It’s even compostable!

I find myself paying more attention to household annoyances and organizational challenges that can be solved with 3D-printed objects. I frequently eye a problem with the intent to design something, but every time so far, I have found something on Thingiverse that can be used without having to design from scratch. I am grateful to Thingiverse for providing this platform, and to all of the makers out there who put their designs out there for us to use and remix.

“Making” useful things

I printed a bunch of these today. Dual purpose, really. The intended purpose is to solve the “tangled filament” issue, where loose filament on the spool backs up when not under tension, and crosses under another row, and when being fed, it sometimes catches and stops feeding. If you don’t catch it, it will fuck up a print for sure.

Dragon clips

They’re called dragon clips, and they clip to the side of the spool, and have a smaller clip into which the filament clips snugly, keeping it taut when not feeding the printer.

Dragon clip in use

But when I printed this, I had an additional use in mind. My UV LED strips around my music lab tend to fall away from the wall mirrors they’re attached to as the adhesive backing fails over time. The strain of cables pulling on them tends to amplify the problem.

I was really hoping these clips would fit behind the mirror and hold the LED strips in place, and it looks like they’re going to work fine for that purpose.

Simple fitness timer / Arduino Uno R3

I’ve been working out religiously this year, and reading up on getting the most out of my workouts. I read something about optimizing the rest time between sets, and I started wondering how to best track that. The phone is a pain, because unlocking it, getting to an app, etc. is distracting and takes time. Laptop, same thing, plus I have to have a laptop in my workout area.

I started thinking, well, I have a whole bunch of unused Arduino Uno units and a 3D printer, maybe I could cobble something together. So what do I need? I want simplicity, so a box with buttons and a display.

I found this case shape on Thingiverse: https://www.thingiverse.com/thing:845415

It’s built for an Arduino Uno and the DFRobot LCD 1602 shield with buttons. The case even has 3D-printed button extenders that are used to press the buttons on the shield. Simple yet elegant. Ticks all the boxes.

I ordered the HiLetGo shield for $6.49 here: https://www.amazon.com/gp/product/B00OGYXN8C/ref=ppx_yo_dt_b_asin_image_o01_s00?ie=UTF8&psc=1

…and started printing the case. I decided to print in two colors for cool points. Black for the bottom case and buttons, silver silk for the top. Came out OK. I might sand & finish it later.

The shield arrived today. I quickly cobbled together some test code, attached the shield to one of my spare Unos and uploaded the sketch.

Nothing on the display. The backlight was lit, but nothing on the display.

Turned on serial in the code and added some debugging. Uploaded again. OK, good news, the Arduino is working, it’s moving around in the code, my button presses are recognized.

I tried adjusting the contrast via the onboard pot. Nothing.

Went to the amazon reviews/comments for the device. Finally found the relevant comment: “Adjust the ‘pot’ for contrast just realize this style pot takes ALOT of turn to get to the other end.”

So, with my tiny screwdriver, I turned and turned, for way longer than I would have expected, and eventually my sample text showed up.

Now I just have to write a quick sketch for button-initiated 60- and 90-second timers, and I’m good to go! Quick, no-fuss timer with customization capabilities and a simple interface and display. Might add a sound feedback in a future iteration.
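The bones of such a sketch are simple. Here's a minimal sketch of the idea — not my actual code — where the pin mapping and the A0 button thresholds are the commonly published values for these 1602 keypad shields and may need tuning on your board:

```cpp
#include <LiquidCrystal.h>

// Standard pin mapping for DFRobot-style 1602 keypad shields
LiquidCrystal lcd(8, 9, 4, 5, 6, 7);
const int BACKLIGHT = 10;  // backlight control pin on these shields

// Button codes for the shield's A0 resistor ladder
const int BTN_NONE = 0, BTN_RIGHT = 1, BTN_UP = 2,
          BTN_DOWN = 3, BTN_LEFT = 4, BTN_SELECT = 5;

// All five buttons share analog pin A0; these thresholds are the
// commonly published values and may need tuning for your board.
int readButton() {
  int v = analogRead(0);
  if (v < 50)  return BTN_RIGHT;
  if (v < 195) return BTN_UP;
  if (v < 380) return BTN_DOWN;
  if (v < 555) return BTN_LEFT;
  if (v < 790) return BTN_SELECT;
  return BTN_NONE;
}

void countdown(int seconds) {
  digitalWrite(BACKLIGHT, HIGH);
  for (int s = seconds; s > 0; s--) {
    lcd.clear();
    lcd.print(s);
    lcd.print("s left");
    delay(1000);
  }
  digitalWrite(BACKLIGHT, LOW);  // screen off when the timer is done
}

void setup() {
  pinMode(BACKLIGHT, OUTPUT);
  digitalWrite(BACKLIGHT, HIGH);
  lcd.begin(16, 2);
  lcd.print("UP=60s DOWN=90s");
}

void loop() {
  int b = readButton();
  if (b == BTN_UP)   countdown(60);
  if (b == BTN_DOWN) countdown(90);
}
```

The shield reads all five buttons from a single analog pin, so readButton() is just a threshold ladder — which is also why that contrast pot was the only hardware adjustment needed.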

UPDATE: Here’s my first stab at a script. Up gives a 60-second countdown, Down gives a 90-second countdown, and the screen shuts off when it’s done so you don’t even have to watch it closely. Screen shuts off after initial display, SELECT to see the very few instructions again.

https://github.com/dc540/arduino1602ShieldTimer

UPDATE 2: Here’s a link to a buzzer that will fit in this exact case. It’s not super loud when wired directly. Supposedly a transistor might kick it up a notch. https://www.adafruit.com/product/1740

As promised, Anet A8 vs Ender 5 comparison

And I realize this isn’t apples to apples. Other factors are involved. A lot of people have been able to get the Anet to print way better than this. But there’s a reason there’s a long list of “essential upgrades” for the Anet. It is a rock-bottom DIY machine. I’m not even going to call it entry-level. For entry-level, a primary requirement is ease-of-use. Lower quality can be excused, but entry-level should not frustrate users out of the hobby.

Left: Anet A8. Right: Same piece, Creality3D Ender 5

Here are two prints of the exact same part, side by side. Same Cura settings, good bed leveling. The piece is designed to hold glue sticks. The piece on the left, because its circles printed inaccurately, won't even accept the glue stick. It's just OFF. Can't even jam it in there. No dice. The piece printed by the Ender, however, has well-rounded holes, and the glue stick slips in with no friction, as intended. The print is within spec.

Now for a closer look.

Anet A8 hole close-up
Ender 5 hole close-up

See how much neater and rounder the holes are on the Ender 5? I attribute this to the Anet having an acrylic frame and just being overall wobbly. And with an acrylic frame, you can't really tighten it too much, or it will crack the acrylic. Accuracy is critical when printing parts that interact with other parts. Also note the uniformity in the fill lines. Still not perfect on this print, but way tighter. This is not because the nozzle is different in any way; it's the same-sized nozzle.

Next let’s look at the layer stacking.

Anet A8 layer stacking
Ender 5 layer stacking

Just so much cleaner.

Again, some of this is definitely attributable to my own skill level. I’m still relatively new at this. A proven amateur. My point is that I got these results from the Ender 5 with a stock, straight out of the box machine, after only adjusting the extruder steps/mm. This is the way.

I still had a bit of an issue with edge curl-up on this print; you might notice that in the side-by-side. I printed the next piece, which is nearly identical, with a brim, and also brought the nozzle just a hair closer to the bed, and about four hours into a 13-hour print, it looks like edge curl isn't going to be an issue with this one. As I get more attuned to this machine, I should get better at predicting and preventing issues like that.

Update: Indeed, the brim held, and the next two pieces were virtually flawless, with no corner curl-up. It’s possible the problem was nozzle distance and it would print fine without a brim. Further testing will reveal that, when I’m not trying to complete pieces.
