Scooter

We had to say goodbye to Scooter on Friday. He was 13. It was a rough week.

We euthanized Yoshi in our home on Monday. That day Scooter started limping. It wasn’t anything new. He was part Shar-Pei and would get Shar-Pei fever from time to time. It would clear up in 24-36 hours and he would go back to normal.

Tuesday night we were concerned. Wednesday we took him to the vet, then the emergency vet. We tried heroics but, like most heroics, it was more about trying our hardest than getting the outcome we wanted.

It seemed like it was neurological, like something broke when he saw Yoshi die. The dogs had been together almost every day for 12 years, so maybe Scooter couldn’t bear to be without his brother.

The dogs weren’t brothers, although people would ask. Scooter looked exactly like Yoshi in his adoption photo, and completely different in person. They were both Shar-Pei mixes, but where Yoshi had bright ears, fluffy fur, and a Shar-Pei’s curly tail, Scooter had a Shar-Pei’s covered ears, short coat, and a Pit’s long straight tail. Where Yoshi was aloof, Scooter was bursting with love, even by dogs’ standards.

Scooter got his name from the rescue, and we couldn’t think of anything we liked better when he came home. Luckily he didn’t have a habit of scooting, and I only made the connection a few months after we got him. The rescue insisted on seeing pictures of the tall privacy fence in our backyard at the time. We found out why when he leapt the 3 foot gate to greet us in the front yard.

He was vocal, and we learned that he wasn’t growling but making playful noises. And he loved to play. He would chase and wrestle and zoom even in his senior years. Anything to get attention from the people he loved.

Losing one dog is a gut punch. Losing two in the same week has shocked our sense of home. It’s so much quieter in the house, despite having two kids under 10. We can enter the house without running a gauntlet of wagging tails. The robot vacuum is finally making progress in its war on fur. There’s no one to let out before bed, and I don’t need to lock the back door because it was never unlocked.

Scooter, you have left a hole in our family and we will always miss you.

Yoshi

Our best guess is that Yoshi was part Shar-Pei and part Lab. He got his stubbornness and curly tail from his Shar-Pei side and his shedding from his Lab side. He loved to curl up with his humans. It was an honor to wake up with him balled up in the crook of your legs or his head resting on your shin.

Yoshi came to our home on July 23, 2008. He was about 1 year old. At the time we had Buddy, an older full Shar-Pei. Having learned the health problems that come with a pure Shar-Pei we decided to adopt a mixed breed.

We were told that Yoshi was left chained up outside before being surrendered to the humane society where we got him. When we got him to our second story apartment, he froze when he saw the stairs. We had to carry a scared, unfamiliar, 55 lb dog up the stairs for the first few days we had him. He got the hang of it eventually.

We had a couch that backed up to a sliding glass door. Yoshi would perch himself on the back of the couch like a cat, or even stand on it to get a better view of what was going on outside. We joked that he must be part mountain goat.

He was always a bit leery of everything. He always had a wrinkled brow so he always looked worried, but if you watched his ears you could tell if he was excited or concerned. We kept his crate for him long after we stopped shutting the door. He always needed a safe place to be. After the crate he would always find corners and closets where he could relax while still keeping an eye on things.

Yoshi would love to roll in the sun, any time of year. If the sun was shining Yoshi would love to wiggle his back in the grass, fallen leaves, or snow.

When he would shed we made a game of gently pulling small tufts of fur that were sticking out, and marveling at how much they bloomed into giant puffs. It’s dumb, but I think that’s part of being a family.

When he went deaf, we didn’t even notice at first. He was stubborn so not listening was par for the course. Once he stopped barking at the doorbell we knew his hearing was actually going, and I selfishly wished it had happened when we had napping babies.

Goodbye Yoshi.

this-is-fine.eml

This is a legitimate email from a Legitimate Financial Institution that I was expecting, but I don’t know what I could change to make it look more like a phishing scam.

We’ve spent the last 20 years teaching people not to open email attachments but I guess Raytheon’s Cybersecurity company didn’t hear about that?

I did open the attachment in an isolated browser after reading the source. Inside was a button that takes you to their secure messaging site’s onboarding flow. There’s 90K of data POSTed in hidden fields, so I suspect the constraints that led to this were:

  • Our security platform generates 90K of data to authenticate that the source of this request is legitimate.
  • That’s too much to add as query string parameters on a GET request, but it works for a POST.
  • Support for forms in email clients is poor, so we need to put the form into an HTML attachment.

Each step solves the previous problem, but at no point did anyone with the power to fix things step in and stop it. They didn’t say “90K is too much, find another way to authenticate the source of the request, preferably less than 2K.” If I had to guess, the people who had the information about how bad the implementation is were completely removed from the people setting the requirements.

All of that exists, too! If you follow the mobile instructions and forward the email to that unknown email address, it generates a very reasonably sized link that takes you to an HTML page hosted by them, where you can POST 90K of hidden data and read your secure message.
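The size constraint behind that second bullet is easy to demonstrate. Here’s a quick Python sketch (the URL, field name, and blob are all invented for illustration) of why 90K of data fits comfortably in a POST body but not in a query string:

```python
from urllib.parse import urlencode

# Hypothetical stand-in for the ~90K of authentication data the email
# stuffs into hidden form fields.
auth_blob = "x" * 90_000

# As a GET request, that data would have to ride in the query string,
# blowing far past the ~8KB request-line limit many servers enforce
# (e.g. Apache's default LimitRequestLine of 8190 bytes).
get_url = "https://example.com/secure-message?" + urlencode({"token": auth_blob})
print(len(get_url) > 8190)   # True: most servers would reject this URL

# As a POST, the same data travels in the request body, which has no
# comparable practical size limit -- hence the hidden-field form.
post_body = urlencode({"token": auth_blob}).encode()
print(len(post_body) > 8190)  # True, but that's fine for a POST body
```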

Despite how awful this system is, I’m still glad that Legitimate Financial Institution is using some secure messaging service to collect my loan documents (I’m getting a loan to buy solar panels) and not asking for them to be faxed or emailed plain text.

Firefox Tip – Use the ^ Search Operator

This list of 11 secret Firefox tips is fantastic! Number 2 will change your life! OK, that’s overselling it a lot. Many of those I already knew, but the second tip really is great:

Search for a needle in a tabstack

Tab hoarders, we see you. Heck, we are you. Don’t ever let anyone shame you for having dozens (and dozens) of open tabs, implying you don’t have it together and can’t find the right one. Instead, dazzle them with this trick. Add a % sign to your URL search bar to search specifically through all your open tabs, including tabs in different windows. Then you can click over to the already open tab instead of creating a duplicate, not that anyone has ever done that.

Bonus tip: If you love that tab search trick, try searching through your Bookmarks with * or your History with ^.

I’ve (somewhat) cut down on my tab hoarding thanks to One Tab and Pinboard, but my Firefox history is massive. Being able to search my history with ^ is a game-changer for me. I’m actively trying to build up my muscle memory for using that search operator.

Get Your Blog Posts on Mastodon

Here’s a full list of steps to get your blog posts on Mastodon:

  1. Install and activate the ActivityPub plugin for WordPress

That’s it! Thank you for following along.

OK, I’d actually like to say a bit more. When I first installed the plugin, I was trying to figure out how to connect it to my Mastodon account. If you’re using WordPress, it’s straightforward to get your blog posts on Twitter, Facebook, Tumblr, and LinkedIn thanks to Jetpack (I work for Automattic, which makes Jetpack, but I don’t work on Jetpack). Jetpack works by connecting to those sites’ APIs with your account, and then posting to your account. I assumed that the ActivityPub plugin would work similarly.

But Mastodon isn’t like any of those other sites. Since anyone can run a Mastodon server, and Mastodon speaks the ActivityPub protocol, the plugin turns your blog into a server in the ActivityPub network (the “Fediverse”). You don’t need another account, your WordPress account is your account.

So how do you actually see your blog posts on Mastodon?

That part actually tripped me up. You have to go to /wp-admin/profile.php to see your Mastodon ID. I don’t know where I read that, but it wasn’t obvious to me. At the bottom it tells me that my ID is @georgehotelling@g13g.blog so I searched for that on my Mastodon server and was able to follow my blog. The format seems to be [username]@[hostname] but you should check your profile page just to be sure.
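Under the hood, that handle lookup uses the WebFinger protocol: when you search for @georgehotelling@g13g.blog, your Mastodon server asks the blog’s domain who that account is. Here’s a small Python sketch of the lookup URL it constructs (no network call, just the URL format):

```python
from urllib.parse import urlencode

def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL a Mastodon server uses to resolve
    a Fediverse handle like "@georgehotelling@g13g.blog"."""
    username, hostname = handle.lstrip("@").split("@")
    query = urlencode({"resource": f"acct:{username}@{hostname}"})
    return f"https://{hostname}/.well-known/webfinger?{query}"

print(webfinger_url("@georgehotelling@g13g.blog"))
# https://g13g.blog/.well-known/webfinger?resource=acct%3Ageorgehotelling%40g13g.blog
```

The ActivityPub plugin answers that request on your blog’s behalf, which is why no separate account or API connection is needed.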

Wishlist

There are a few things I’d like to see from this plugin in the future:

  1. Remote Follow. I’d love to be able to create a remote-follow page for my blog, similar to how Mastodon has a remote follow page. The Social Icons block in Gutenberg already has a Mastodon logo, but it would be nice to be able to link to a page that lets you subscribe, like on Mastodon.
  2. Customizable Feeds. Another thing I’d like to see is the ability to create and name feeds for posts that match a WP_Query. First off, I could have a shorter ID like @blog@g13g.blog or something. I could also use it with Custom Post Types to have short links (like Waxy or Kottke) on one feed, maybe photos on another feed for Pixelfed.
  3. Replies as Comments. If someone replies to my post anywhere on the Fediverse, I want the option to include that in the comments section.

Oh, and about Mastodon…

I have said it elsewhere but I love my Mastodon instance. It reminds me of BBSs back in the day, or maybe the local Livejournal group. It’s cozy with a lot of familiar faces. 90% of the content on the instance is marked “followers only” for privacy, so if you’re interested be sure to follow people you see mentioned.

Mastodon, like blogs and RSS, is also one of the last places on the web where you don’t have an algorithm choosing what you see. No one is optimizing the software for engagement metrics. That alone is pretty valuable to me.

Open Source Ambilight LEDs on a Raspberry Pi for $100

I made this for about $100 with a Raspberry Pi and no soldering:

Fluid Sim Hue Test on YouTube

I’ve always thought that Philips Ambilight TVs were cool. They do what you see in that video: shine the edge colors past the TV. But it was always a “nice to have,” so when I was buying my TV I prioritized other features. Later, Philips launched the Hue Play HDMI Sync Box, which would let you create an Ambilight effect with Hue light strips. Again, cool, but not $300-and-tied-to-a-proprietary-system cool.

BTW, the generic name for “Ambilight” is “bias lighting,” so I’m going to start writing that instead. Aside from looking cool, I’ve had some eye strain issues with my TV and heard that bias lighting could help with that. After using it for about a week I can say that yes, it does!

I’ve also known that LED light strips are really cool to work with, but it’s been years since I soldered anything so it was all pretty intimidating. Also intimidating: flashing microcontrollers like the ESP32. I’ve always felt more comfortable with a Raspberry Pi because it’s a Unix system; I know this. When I found out that I could build a bias lighting system for about $100 with a Raspberry Pi and no soldering, I jumped on it. It started with this video guide from DrZzs:

The magic behind it is Hyperion, an open source system for doing bias lighting based on an HDMI input source. I had no idea something like that existed, and now it’s glued to the back of my TV. As a bonus, there are a bunch of fun effects so you can use it as an ambient RGB light when your TV is off. I can also control it with Home Assistant!

Here’s what I wound up buying:

LEDs

DrZzs recommends these 150 LED/5m strips but I bought the 300 LED/5m version. In retrospect, the 150 LED strip would probably have been better because it draws less power and we don’t need very high resolution for this. I have a 55″ TV and used 212 LEDs from the strip.

Power Supply

Someone on reddit suggested a 20A power supply to go along with the higher power draw of a 300 LED/5m strip. I prefer the 10A power supply that DrZzs linked to for two reasons:

  1. The 20A supply requires you to wire in your own power cord
  2. The 10A supply comes with a barrel connector that makes plugging everything in easy

Remember, I’m threading the needle between “excited for bias lighting” and “too complicated to bother with,” so convenience matters.

I was worried that the 10A supply wouldn’t be enough for the 300 LED strip, and technically it isn’t. The conservative estimate for amperage is 0.06A/LED, so 300 LEDs could draw as much as 18A. I also followed DrZzs’s tip around 15:30 to power the Raspberry Pi (which calls for a 2.5A supply on its own) from the same power supply. However, the more realistic estimate for LEDs is 0.02A/LED, so my 212 LEDs plus about 350mA for the Pi comes to roughly 4.5A. Plenty of headroom on a 10A supply.
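The back-of-the-envelope math looks like this in Python (the per-LED figures are common rules of thumb for 5V strips, not measurements of my setup):

```python
# Rough amperage budget at 5V for the LED strip plus the Pi.
FULL_STRIP = 300              # LEDs on the strip I bought
LEDS_USED = 212               # LEDs actually mounted around the 55" TV
WORST_CASE_A_PER_LED = 0.06   # every LED at full white, full brightness
TYPICAL_A_PER_LED = 0.02      # realistic mixed-color video content
PI_A = 0.35                   # ballpark Raspberry Pi draw

# Conservative worst case for the whole strip: 18A, beyond a 10A supply.
worst_case = FULL_STRIP * WORST_CASE_A_PER_LED

# Realistic estimate for the LEDs actually in use, plus the Pi: ~4.6A.
realistic = LEDS_USED * TYPICAL_A_PER_LED + PI_A

print(round(worst_case, 1))   # 18.0
print(round(realistic, 1))    # 4.6
```

In practice the strip never comes close to all-white at full brightness, which is why the realistic number is the one that matters.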

HDMI Capture

The HDMI Capture Loop that DrZzs linked to in his description is sold out on Amazon so I went with a solution I found in the reddit comments. DrZzs’s recommendation is a capture loop – it sits inline with the HDMI between the source and TV, and sends a capture to the Pi via USB.

I got this 4K HDMI splitter and this 1080p USB capture card. It means more HDMI cables behind my TV, but also helps with my cable management because my TV’s inputs are separate from the TV. So the HDMI splitter sits in my media console while the Pi is on the back of the TV.

Misc

These little wires are actually a huge part of what makes this project accessible to me. When I’ve seen LED projects before, you’ve had to solder wires to an ESP32 and then to the tiny contact strips. What I like about this project is that I grab a female-to-female wire, plug it into a GPIO pin on the Pi, and plug it onto the male connector of the LED strip. Done. Connected.

These corner connectors are great for making the LED strips sit flat. When I was roughing the LEDs in, I just made loops at the corners to get a bend, but these make it look much better. It took me a few tries to figure out that the corners and strips go under the pins, but I got there eventually. Also the pins weren’t perfectly aligned with the copper bits on the LED strip, at least until I nudged them into place.

Gotchas

I did run into some problems, or at least went off script from the tutorial video. First, instead of using Raspbian and manually installing Hyperion like in the video, I used a Pi image put out by Hyperion called HyperBian. Download the image, flash it to an SD card, set up the WLAN, and it’s good to go.

Second, the LED strips on the Pi just did not work right when I connected the data pin to GPIO 18. I think DrZzs glosses over the importance of having the Pi and the LED light strip share a common ground. The lights were fuzzy and not powering all the way until I grounded the LEDs to the Pi. Just like I have a wire going from the GPIO 18 pin on the Pi to the data pin on the LEDs, I added another wire going from the ground pin on the LED strip to a ground pin on the Pi. Suddenly everything lit up perfectly.

The last gotcha was that there was quite a bit of lag between the TV screen and the LEDs. If you don’t notice latency, don’t go looking for it. You will be cursed with the knowledge that it’s there. I’m not going to mention it to my family so they don’t notice it. I did make good progress on reducing the latency though, and I’ll outline how in a future post.

This was a fun project and I’m much more comfortable with LED strips now. I’m looking for other places in my house to install them. It’s also been fun to see which scenes have been really enhanced by the lights. I’ll end with one of my favorite examples so far, from Mary Poppins:

My First Time on the Other Side of the Screen

I messed up rules, forgot what I said, and didn’t add any of the flavor descriptions I had planned. But people had fun, so that’s OK. After three years of playing Dungeons & Dragons, I survived my first session as a Dungeon Master.

Once a year my employer gathers all our distributed workforce for a week of in-person work. My coworker Payton organized some groups to play in the downtime we have some evenings. He wrote the scenario, provided pre-gen characters, dice, pencils, everything! Payton also asked people if they wanted to play, DM, or “play but could DM if needed.” I chose option 3.

The game was a one-shot scenario based on Stranger Things called “Unusual Things” for a group of level 2 characters. His description:

Nothing much happens in the mountaintop town of Hawkurns, where the populace mines magical crystals and their kids get into all kinds of mischief. Recently, however, people have been disappearing, and no one knows why. Rumors of unusual things are everywhere. Can you help solve the mystery before something worse assails this small town?

My players solved the puzzles, found the big bad guy and defeated him! Some of the townsfolk even survived!

Aside from Payton’s organizing, two things helped me a lot for my first time on the other side of the DM screen.

/r/DMAcademy has a lot of good discussions and tips. It also presents a wide array of experiences so I felt like I could handle the weird things my players did. The players still threw me for a loop, but the important thing was that I was fooled into thinking I could handle it.

The other big help was my friend Chris Salzman’s podcast Roll for Topic, where he and Andy Rau roll a d20 to decide what to talk about with their guest GM. (Skip episode 20 if you’ve never listened before, it’s an off-format episode.) Chris runs the 5th Edition game at my coworking space as well as a Blades in the Dark game I’m in with Andy on Roll20. I also recommend listening to your own DM’s podcast if they have one, just to find out how much you messed up their plans.

Chris’ sign-off for the podcast is “Remember, if your players are having fun you’re a great GM” which is the best advice I got.

(The photo at the top is from a different game I was in with maps and miniatures. I forgot to take any pictures of our game, and my hand-drawn maps with beer caps for enemies were much less photogenic.)

Brent Simmons on why he’s not adding algorithmic timelines to NetNewsWire, his RSS reader:

These kinds of algorithms optimize for engagement, and the quickest path to engagement is via the drugs outrage and anger — which require, and generate, bigger and bigger hits.

This is what Twitter and Facebook are about — but it’s not right for NetNewsWire. The app puts you in control. You choose the sites and blogs you want to read, and the app reliably shows you their articles sorted by time. That’s it.

Update: Brent also wrote a follow-up highlighting these tweets:

[Embedded tweets]

1. and 2. mean it’s not the algorithm’s fault. There’s no way to write an engagement algorithm that doesn’t select for outrage and anger. But 3. means anything that incorporates such an algorithm actually makes us worse people.

Imagine messing up a laptop camera so badly that a return to normalcy is literally a headline feature

XPS 13 2019 review: One small move made Dell’s best laptop even better

Dell gave its XPS laptop an overhaul last year, but 2019 is all about refinement. Announced at CES, this year’s XPS 13 laptop looks largely the same as the 2018 model, but it has a few new and improved features that attempt to right some of the wrongs of the previous generation.

Loneliness and Junk Social Media

I keep thinking about this video Jason Kottke linked to:

The health effects of loneliness are well documented, but I’d never actually thought of it in biological terms. We hunger for connection in the same way we hunger for food, and we hurt when we are shunned in a very physical way.

Jason’s analogy to sugar really got to me:

Like our affinity for sugary foods, the feeling of loneliness turns out to be another one of those things that served humans well when we lived in small hunter-gatherer groups tens of thousands of years ago but often works against us in our individualist modern world.

I think that a lot of online social networking is the social equivalent of junk food. Call it junk socializing. Everyone knows a “Facebook friend” isn’t a friend; otherwise why would it be qualified with “Facebook?” But we still feel like we’re connecting with people when we like their pictures on Instagram.

If Facebook, Twitter, and Instagram are bad, then podcasting, YouTubers, and streamers are worse.

"how it feels to listen to podcasts" with image of a guy socializing with an ad

After listening to a “two guys talking” podcast I feel like I’ve just hung out with some friends, even though the podcasters don’t know me at all! Maybe I feel less lonely, in the same way I feel full after eating a Quarter Pounder w/ Cheese: I sated my hunger but not in a sustainable way.

As a society, we need to work a lot on social isolation. I’m not 100% in on Cal Newport’s “delete your social media” proscription, but it’s starting to make a bit more sense every day.