3D Printing Using Flexible Filament (TPU)

Through my short journey with 3D printing, I’ve spent a lot of time reading the 3D Printing subreddit, and something I found interesting was people talking about printing with flexible filament (TPU). While I didn’t have a real use for printing squishy things, I was curious. A few weeks ago, I purchased a roll of SainSmart Flexible TPU filament to see if I could print with it.

The forums and other references indicated that printing flexible filament was difficult because pushing the filament through the printer is like pushing a wet noodle! Some people said that they printed with the stock printer, while others said that for best results they modified the printer into a direct drive system. I was up for the challenge!

Before I started, I had already made the following modifications to my printer:

  • Replaced the extruder with an all metal one

  • Replaced the Bowden feed tube with a Capricorn one. The gist of this change is that the tighter tolerance of the tubing doesn’t allow the filament to wiggle around and bunch up. In addition, when switching filaments, I don’t have to purge as much filament as very little gets stuck in the tube.

  • Added a filament guide that I printed. This should create a smoother path for the filament.

  • Added a filament runout sensor with guide. The main goal with this modification was to be notified when filament runs out so I can change it during a print. It also really helps feed the filament (with the flexible filament, I have to push down the microswitch to feed it).

  • Added a filament holder with bearings. This took a while to print, but has been great. On my first flexible print, I noticed the extruder was having trouble pulling the filament because the roll wasn’t spinning freely. I helped things along, but realized that reducing friction would be a big help.

  • Leveled the bed manually and with the BLTouch.

Other than that, I’m using the stock magnetic bed that I cleaned.

My Cura settings are pretty straightforward.

For the TPU material settings, I used:

  • Print temperature: 215°C
  • Build plate temperature: 50°C

Profile settings:

  • Infill density: 10% (I was printing something squishy)
  • Print Speed: 20 mm/s (I’m going to try increasing this as things worked well)
  • Regular Fan Speed: 0%
  • Regular Fan Speed at Layer: 1
  • Material Flow: 110%
  • Enable Retraction: Off

That’s really all there was to my settings. Since I’m sometimes not the most adult person, I thought it would be funny to print a Poop Emoji for my son. It was squishy (10% infill) and I printed it at 50% of the original size keeping the print time to about 2 hours. My son absolutely loved the print. I was amazed at the print as I did it on the first try with my cold printer.

I’m going to keep experimenting with TPU and try to figure out what else I can print. I have no need for flexible filament, but why not print more stuff!

Installing a BLTouch on an Ender 3 Pro

I recently purchased a Creality Ender 3 Pro 3D printer as an “upgrade” to my Monoprice Select Mini Pro printer. There were a few reasons I decided to do this:

  1. Larger print area. While the Select Mini Pro is a great little printer, I’m limited in what I can print, and I’ve started getting interested in printing lithophanes, which can be larger than the printer can handle.
  2. Automatic bed leveling has no fallback. The Mini Pro uses an inductive sensor for automatic bed leveling and if that fails, I either have to replace it or am pretty much out of luck (someone has posted instructions on how to level the bed if the sensor fails, but it is a tad cumbersome).
  3. No ability to switch out the build plate for glass or another material. I’ve done a lot of reading about 3D printing and people swear by glass build plates and the Select Mini Pro doesn’t make it easy to add one. A shim has to be added to the Z axis limit switch and then you have to figure out the bed leveling.
  4. Limited to what filament types can be used. The printer has a maximum build plate temperature of 70°C and a limited nozzle temperature, which restricts the filaments that can be used. Also, I’ve read that while some people have had success with TPU (a flexible filament), it may not work so well.
  5. No ability to modify the firmware for different features. The Ender 3 runs the Marlin firmware which is open source and can be easily modified.

In any case, I’ve read about manual bed leveling and while doable, it seems like a lot of work and I like easy! After setting up the printer and running it for a few days, I decided to install the BLTouch automatic bed leveling probe. In the weeks leading up to setting up the printer and the probe, I had read numerous articles and watched a number of videos on the subject, so I thought I was prepared for it. Parts of the setup seemed a bit daunting, but nothing I couldn’t handle.

The first step to installing the probe was printing out a mount for it. Thingiverse has a number of options. I settled on this mount as it was adjustable. Printing it and attaching the BLTouch was quite easy; I didn’t have the right size M3 screw, so I had to cut a longer one down to size.

After attaching the BLTouch, I had to run the extension wires through the sleeve that held the other wires. This was a little bit of a pain. The only mistake I made here, which bit me later on, was that the extension cable became disconnected; the BLTouch failed to operate and the nozzle hit the build plate. Oops. The first lesson was to hot glue the connectors together so that any jiggling of the cables wouldn’t cause them to disconnect. The second lesson was to always make sure the BLTouch performs its self test when the printer powers up.

Phew, I’m exhausted just writing that up! After the wires were run, I had to attach them to the motherboard. The BLTouch has two connections. The first goes through a pin 27 connector: unplug the LCD cable, plug in the connector, plug the LCD back in, and attach the three wires from the BLTouch, making sure the orientation is correct by verifying that the labeled pins match the color-coded wires. The second connection replaces the Z axis end stop. The extension cable I bought had a non-keyed connector that just plugged in; unfortunately it wasn’t a secure connection, so I used hot glue on it the first time I connected it. On my second poke at the motherboard (due to troubleshooting the connector that came loose, as I mentioned earlier), I decided to just cut the wires on the Z axis end stop, solder on the extension cable, and use some heat shrink tubing. I had to make sure the white wire was towards the front and the black wire was towards the back. This was a much better connection and has less of a chance of coming loose. After I buttoned up the motherboard, it was on to the firmware.

Once I initially got the printer set up and running properly, I upgraded the firmware, mostly to know that I had thermal runaway protection and the latest changes. Compiling the firmware was straightforward and is explained in various posts and videos. Most of the posts talk about installing the initial firmware with a bootloader using an Arduino board. As I don’t have any Arduino boards around, I opted to install it using the Raspberry Pi 3B that I purchased to run OctoPrint. I used this guide, which was easy to follow, to perform the initial install. The printer didn’t come with a bootloader, which required the extra steps to install the firmware the first time; why it shipped without one, I have no idea. Over the course of a few days, I managed to pick the options I wanted for my firmware. Unfortunately my Creality 1.1.4 board doesn’t have much space on it, so I had to disable SD card support. This wasn’t a big deal as I do all my printing through OctoPrint. Using the base Ender 3 Marlin 2.0.1 example configuration, I made the following changes:

Old Configuration.h:

    #define SHOW_CUSTOM_BOOTSCREEN
    #define BAUDRATE 115200
    #define CUSTOM_MACHINE_NAME "Ender-3"
    //#define BLTOUCH
    #define NOZZLE_TO_PROBE_OFFSET { 10, 10, 0 }
    #define MIN_PROBE_EDGE 10
    #define Z_PROBE_SPEED_FAST HOMING_FEEDRATE_Z
    #define Z_CLEARANCE_DEPLOY_PROBE   10 // Z Clearance for Deploy/Stow
    //#define Z_MIN_PROBE_REPEATABILITY_TEST
    //#define PROBING_FANS_OFF          // Turn fans off when probing
    //#define AUTO_BED_LEVELING_BILINEAR
    //#define RESTORE_LEVELING_AFTER_G28
    //#define LEVEL_BED_CORNERS
    //#define Z_SAFE_HOMING
    #define SDSUPPORT
    //#define NOZZLE_PARK_FEATURE
    //#define SLIM_LCD_MENUS

New Configuration.h:

    //#define SHOW_CUSTOM_BOOTSCREEN
    #define BAUDRATE 250000
    #define CUSTOM_MACHINE_NAME "Ender 3 Pro"
    #define BLTOUCH
    #define NOZZLE_TO_PROBE_OFFSET { -44, -16, 0 }
    #define MIN_PROBE_EDGE 44
    #define Z_PROBE_SPEED_FAST HOMING_FEEDRATE_Z / 5
    #define Z_CLEARANCE_DEPLOY_PROBE   15 // Z Clearance for Deploy/Stow
    #define Z_MIN_PROBE_REPEATABILITY_TEST
    #define PROBING_FANS_OFF          // Turn fans off when probing
    #define AUTO_BED_LEVELING_BILINEAR
    #define RESTORE_LEVELING_AFTER_G28
    #define LEVEL_BED_CORNERS
    #define Z_SAFE_HOMING
    //#define SDSUPPORT
    #define NOZZLE_PARK_FEATURE
    #define SLIM_LCD_MENUS

Old Configuration_adv.h:

    //#define BABYSTEP_DISPLAY_TOTAL          // Display total babysteps since last G28
    //#define BABYSTEP_ZPROBE_OFFSET          // Combine M851 Z and Babystepping
      //#define BABYSTEP_ZPROBE_GFX_OVERLAY   // Enable graphical overlay on Z-offset editor
    //#define ADVANCED_PAUSE_FEATURE

New Configuration_adv.h:

    #define BABYSTEP_DISPLAY_TOTAL          // Display total babysteps since last G28
    #define BABYSTEP_ZPROBE_OFFSET          // Combine M851 Z and Babystepping
      #define BABYSTEP_ZPROBE_GFX_OVERLAY   // Enable graphical overlay on Z-offset editor
    #define ADVANCED_PAUSE_FEATURE

Flashing new firmware after the first install was easily done through OctoPrint, without having to hook up jumper wires or remove the motherboard. It is so easy that I made changes, recompiled, and uploaded new firmware a few times.
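For the curious, flashing over USB from the Pi essentially boils down to an avrdude call along these lines (a sketch only; the part name, serial port, baud rate, and hex filename are assumptions that depend on the board, bootloader, and how you compiled Marlin):

    # write the compiled Marlin hex to an ATmega1284P-based Creality board
    # over the USB serial bootloader ("arduino"/stk500v1 protocol)
    avrdude -v -p m1284p -c arduino -P /dev/ttyUSB0 -b 115200 -D -U flash:w:firmware.hex:i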

Was I done yet? Of course not! I hadn’t even leveled the bed! Some guides say to add a G29 command to your Cura start G-code, which runs auto bed leveling on every print. Auto bed leveling is slow and that’s just a waste of time, so I decided that I’ll just level the bed every few days. I started up the printer, did an Auto Home, and verified that touching the probe sent the hot end up (if it didn’t, I would have stopped the printer). Using OctoPrint, I sent

    G28 ; auto home
    G29 L50 R150 F50 B150 T V4

to the printer to start the auto bed leveling. This probes a bit wider grid than the default G29 command. Then I sent

    M500

to store the settings, followed by

    M501

to read the settings back.

Was I finally done? Nope. The next piece of the BLTouch configuration was to properly set the Z Probe Offset. This is the distance between the bottom of the nozzle and the bottom of the probe. The probe is, obviously, slightly higher than the nozzle (so that it doesn’t drag). Most guides say to start at zero, print a first layer, adjust the Babystep Z, use the M851 command, and then store the setting. I did this and, after a number of adjustments, got things printing quite well. However, after reading this manual bed leveling guide, I realized that there was a slightly easier way. If I have an object of known height, e.g. a cube whose height I’ve measured with calipers, I can raise the Z axis (using the controls on the printer), put the cube directly under the nozzle, and adjust the Z axis so that the nozzle barely touches the cube; that tells me exactly what the Z Probe Offset should be. If the cube is, say, 15 mm tall and the Z axis shows 15.88 mm, I’d set the Z Probe Offset to -0.88 mm. So much less guesswork. Don’t forget to store the settings after setting the Z Probe Offset. Since my settings are pretty good using the Babystep Z (with my firmware options, it’s simply called Z Probe Offset), I haven’t actually tried using a cube of known size to set the offset.
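Using the example numbers above, applying and saving the offset from the OctoPrint terminal takes just two commands:

    M851 Z-0.88 ; set the Z probe offset from the cube measurement above
    M500        ; store the setting in EEPROM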

Lastly, I used the Level Bed Corners menu option I enabled in the firmware to manually level the bed as well as I could, and then re-did auto bed leveling so that the firmware would have less to compensate for. In addition, I used the OctoPrint Bed Level Visualizer plugin to see how close I got to a level bed. I know that the bed is going to warp at some point, but for now I have a pretty flat and level build plate.
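As a quick sanity check, the stored mesh that the visualizer plots can also be dumped straight to the OctoPrint terminal with a single command (assuming a mesh has already been probed and saved):

    M420 V ; report the bed leveling state and print the stored mesh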

Even after I had everything set up, I discovered that my bed was much higher on the right side, even though it was pretty good when I had manually leveled it. It turns out that my X gantry was loose, so the right side was higher than the left. There are plenty of videos that explain how to tighten this; don’t follow the Creality video as it is pretty useless.

To summarize, here are my tips:

  • Use hot glue on the extension cable connectors.
  • Splice in (and solder) the extension cable connectors (black and white wires) to the Z axis end stop.
  • Compile your own firmware so that you can get the settings you want.
  • Don’t put the G29 code in your Cura profile as that is a waste of time; level your bed every few days if needed.
  • Manually level your bed after everything is set up and then re-run auto bed leveling.
  • Check to make sure that your X gantry is tight.
  • Use a known-height object, e.g. a cube, to set the Z Probe Offset.

Of course, follow my directions/tips at your own risk. I’m not an expert at this and am not responsible for any problems arising from your use of this information! I thought I had everything working well until I managed to crash the nozzle into my bed (I now have a nice dent in the bed that I’ll get around to replacing someday).

Good luck! Getting the BLTouch working properly isn’t an easy task, but it’s hopefully worth it in the long run.

Addicted to 3D Printing

I wrote about 3D printing a few months ago and at the time was using a DaVinci 3D printer. Unfortunately the printer didn’t last long and despite my best efforts, I couldn’t get it to keep printing; it was making a grinding noise while feeding and the cost to replace the hot end with shipping was about 1/3 the cost of a new, low-end printer. I decided to cut my losses and purchased a Monoprice Select Mini Pro. This printer cost me about $180 and when it arrived, I was printing pretty quickly.

The Select Mini Pro, unlike the DaVinci printer, requires a bit more tinkering to go from a model to a print. Printing requires a process called slicing, and one of the more popular slicing programs is Cura, which exposes a ton of options to control the print. As I didn’t want to have to copy files to the SD card to print, I set up OctoPrint running on a Raspberry Pi 2. This put the printer on the network and made it easier to monitor.

I’ve been printing things like crazy and as I may have already mentioned, I’m addicted! I added a camera to the Raspberry Pi and can now see the progress of my prints without being next to the machine.

With all the success I’ve had, I’ve also had a number of failures. Sometimes these failures have been my fault (I ran the nozzle into the bed, damaging the bed and nozzle, and when replacing the nozzle I didn’t screw it in while everything was hot, leading to oozing material) and sometimes not my fault (the coupling holding in the feed tube broke).

I’ve learned a number of things about this hobby, the first being that everyone has a different opinion on how to fix things! A recurring topic on Reddit is how to get prints to adhere to the print bed, with lots of different suggestions. I started using blue painter’s tape and had great success with that. Then I moved my printer to the garage and found I needed to use a glue stick in addition to the painter’s tape. The day after I printed some whistles, the prints stopped sticking to the bed no matter what I tried. Then it dawned on me that the garage temperature had dipped a few degrees; this was enough to cause a problem. I returned from Home Depot with some rigid polystyrene foam insulation and built an enclosure for the printer. With this enclosure, I decided to try printing right on the print bed without the tape; this worked quite well now that I had more control over the temperature of the print bed. With every problem, there is a solution, but it requires some research and a lot of trial and error!


The bottom line is that 3D printing is a hobby and if you’re not comfortable futzing and repairing things, then the consumer grade printers are definitely not for you!

Silence Unknown Callers – Great in theory, problematic in reality

We’ve all suffered with telemarketers and scammers calling our phones and have had limited success in blocking those calls. In iOS 13, Apple added an option to “Silence Unknown Callers” which sounds like a great feature on the surface. I turned the feature on and then quickly turned it off as I realized that there are a number of cases where I need to receive calls from unknown callers. Some might be thinking that it is fine to let them go to voicemail, but it’s not that simple.

A few months back, we stopped at the scene of an accident to help out. The injured party wanted to call a friend, but her phone battery was almost depleted, so a bystander let the person use her phone. The friend would have gotten a call from an unknown number, and with the silence option it would have gone to voicemail and potentially been ignored. Granted, some people ignore unknown callers anyway, but the option wouldn’t have given the friend the opportunity to answer the phone.

If the emergency scenario seems too extreme, another case arose for me this past Wednesday. My son went on a field trip and the bus bringing the students back to school was late, so he used his teacher’s phone to call me. I received the call from an unknown number right as I was about to get to his school to pick him up. If I hadn’t gotten the call and my son hadn’t left a message, I would have been sitting around wondering where he was. Eventually I would have gone into the school office to see what was up, but that would have been after waiting a while.

Without using this feature, I unfortunately have to live with the telemarketers and scammers. Who knows if STIR/SHAKEN will work to block many of these calls; we can only hope. One thing that could definitely help, and I have no idea why it never got implemented, is caller name as part of caller ID on cell phones; I’ve wondered about this for years.

Fixing a broken printer

Yesterday my wife came home and said she had picked up a free, broken printer and wanted to know if I could take a look at it as she’d love it for her classroom. It was an EPSON ET-4750, which is the big brother of the EPSON ET-2750 that we’ve had for a year and have been quite happy with.

Of course I said sure, I’d take a look, and asked if I could swap printers if I got it going. She didn’t hesitate and agreed. The problem, I was told, was that the printer wouldn’t feed the paper. I opened up the back of the printer, which has the feed mechanism, and saw some broken plastic. Upon further inspection, I saw the broken gear where the plastic was supposed to go. Uggh, I thought. I looked at the back of our printer and it had a similar door to get to the feed mechanism, so I took it off in hopes that I could just replace it and be done. No such luck. However, looking at the broken gear I saw that our printer had the same gear on the feed mechanism. I was able to pull off the gear and put it on the broken printer and it fit! So at least I got a new printer for me 😀 That, of course, wasn’t going to help my wife.

As I indicated in my post about 3D printing, I’ve always envisioned just being able to print spare parts to prolong the life of things. A search online didn’t find the gear I needed, but I did find sites that could generate gear files. I asked my wife to count the teeth on the gear and I started playing around with a site that let me enter parameters and generated the gear. I tried a few sets of parameters to make the gear look like the one I had. I printed a test gear (the site gave me an STL file that I needed to modify a bit) and while it wasn’t perfect, I thought I could make it work.

After a bit of work with TinkerCAD, I printed a working gear. While it isn’t an OEM part and could be a little more precise, I’m pretty impressed with what I made. Part of the issue may just be that the 3D printer isn’t precise enough to make a true replacement.

Gear on paper feed

Gear

I’ve published my work on TinkerCAD for others to enjoy.

If you find this helpful, please let me know. Also if there is a way to start convincing companies to publish STL files for parts and you have ideas on this, let me know.

3D Printing comes to my home

Years ago when 3D printing started to become mainstream, I thought the technology had its place for prototyping, low volume manufacturing and printing replacement parts for appliances and other things. In my ideal world, instead of companies selling replacement parts for say a refrigerator, they could license their designs and parts could be printed at a local hardware store or the like using a variety of materials. Companies could still make money on the parts, but they wouldn’t have to stock them or ship them. In addition, parts for discontinued products could be made prolonging the life of products.

As the price of 3D printers came down and became easier to use, they started appearing in schools and homes. My son started making trinkets in a class he took and I dismissed the low end 3D printers as toys. While they may have been relatively inexpensive and easy to use, I couldn’t imagine a real use for one despite my son asking if we could get one a few times.

A few weeks ago, a neighbor/friend of mine gave us a 3D printer that he had had for a while but rarely used. He gave us a XYZ Printing DaVinci Jr. 1.0. This is a very consumer-friendly printer with very few settings to mess up, I mean configure. One downside of the printer is that you have to purchase the filament from the company as there is an NFC chip in each spool that tells the printer some parameters about the filament (yes, there are hacks to get around this). For someone just starting in 3D printing, I saw that as a plus. Getting it set up should have been easy, but it was quite frustrating. I made an adjustment to the Z offset and my son suggested we use blue painter’s tape on the bed, which worked quite well.

Now that we had a 3D printer in the house, I didn’t want to make trinkets; I wanted to design and make things that were useful. My son showed me TinkerCAD, which he uses to make things, and I took to it pretty quickly. The last time I touched a CAD program was 25 years ago in college and that was quite painful. TinkerCAD is easy to use and I got used to looking at designs in 3D.

I’ve spent a few weeks designing things and printing them; my skills are getting better and I’m not sure there is an end in sight! I didn’t know that I could solve so many problems by making parts. I also didn’t know that I had so many problems before I started looking for them!

The sky is the limit and I’m going to keep designing and making things!

Here are a few pieces that I’ve made:

Eagle Scout Award Holder

I didn’t want to put holes in my new Scout uniform with pins. On formal occasions, I wear my Eagle Scout pin, and I wanted a solution so that I didn’t have to pin it to my uniform (I can never get the holes right when I put things back on). This design has a place for me to put the pin through, and then I put a magnetic name tag backing on it.

EagleAwardHolder

Knife Sheath

When we go camping, I have a separate knife I use for cooking. I had this wrapped in cardboard, but I wanted something more permanent.

KnifeSheath1

KnifeSheath2

Outdoor Light Stake

On our front walkway, I have low voltage landscape lights. One of the lights I’ve knocked over a few times, breaking off the stake that holds it. The lights aren’t made any more and getting a replacement stake could be hard. I’ve fashioned a few holders, but they haven’t worked well, so I decided to make my own. I printed this with filament that I’m sure will break down at some point (it isn’t outdoor rated), but I can print another one later with the correct filament when that happens. This is exactly one of the uses I described at the beginning of this post: replacement parts. Instead of pounding the stake into the ground, I dug a hole, put it in, and then packed dirt around it. It seems to be holding up.

LightStake1

LightStake2

Subaru Impreza Phone Cable Holder

My car has the USB ports for connecting my phone in the center console where I think they expect you to place your phone. This isn’t convenient for me and I’ve always put my phone in front of the gear shift and snaked a cable from the console to that spot. In order for the cable to stay there when my phone isn’t plugged in, I had a wire that I jammed in between two pieces of plastic. I could have glued a holder there, but I didn’t want to do that. So I designed a hook that I was able to push into the spot where I had the wire. It’s been holding up quite well and almost looks like it belongs.

SubaruCableHolder2

SubaruCableHolder1

Luminoodle Light Hooks for a tent

I purchased a Luminoodle LED strip light for camping and found that there weren’t enough hooks to easily set it up in my tent. I came up with a design to hold the lights and attach it to my tent. It took several iterations to get exactly what I wanted, but this may be my best work yet. The design is simple, but works quite well. I printed a bunch of them and put them in my 2 tents so that they just stay with the tent.

LuminoodleTentHooks4

LuminoodleTentHooks3

LuminoodleTentHooks2

LuminoodleTentHooks1

Fixing the Vizio SB36512-F6 Soundbar

For a number of years, I’ve had a Vizio 5.1 soundbar which worked reasonably well. It was connected to my TV via an optical cable. The only real issue I had with it was that I had to use an IR remote to control the volume. Luckily the Apple TV remote has the ability to send IR commands to control the volume. This worked OK, but always required me to aim the remote at the soundbar and press multiple times if I wasn’t aligned with it. Last year when Apple said it was adding Dolby Atmos to the Apple TV 4K, I was intrigued. While the room my TV is in isn’t ideal for Atmos, I wanted to give it a try. My current soundbar, of course, couldn’t handle Atmos. I saw the Vizio SB36512-F6 on sale at Costco and picked it up.

Setup of the soundbar was simple; plug the HDMI cable from the Apple TV into the soundbar and then the soundbar into the TV via ARC. This would allow the soundbar to handle Atmos. In addition, since it was connected via HDMI, I could use HDMI-CEC to control the volume without the need for setting up IR on the remote (I could also use the Apple TV remote control center widget on my iPhone or iPad to control the sound which is kind of neat). For the most part, this setup worked and when Atmos support came to the Apple TV and Netflix, I was able to use it; I couldn’t tell much of a difference as there is still not a lot of content that supports it. However, we kept having problems where no audio would come out of the soundbar and it required us to power cycle everything or quit an app and start over. It was annoying to say the least. I went through a number of firmware upgrades and patiently waited for Vizio to fix the issue, but it never happened.

I reached out to Vizio support and they gave me some suggestions, but all of them would result in not having the ability to use Atmos (which was one of the reasons for the soundbar). After dismissing their suggestions for a while, I finally decided to try one of the options. In the Apple TV’s audio settings, instead of Automatic, I enabled Change Format and set it to Dolby Digital 5.1. Ever since I did that, audio has worked perfectly. While I lost Dolby Atmos, I also lost the frustration of not having audio. What I suspect is happening is that when Change Format is selected, the Apple TV always outputs the same audio stream type and the soundbar doesn’t have to figure out how to decode the audio; with Automatic, the soundbar is sometimes unable to properly decode the stream when a show or movie starts. It then gets confused and just doesn’t play anything.

If you have an Apple TV 4K and are having audio problems with a soundbar, I’d suggest trying the Change Format setting. It is really too bad that Vizio can’t figure out how to fix this issue as I’d like to sometimes play Atmos content without having to switch the setting.

Review: Motorola Talkabout T460

Many years ago I purchased FRS (Family Radio Service) radios, commonly called walkie talkies. I chose the Motorola T5000 because it came with 4 radios and the price was right; I know I didn’t spend much time researching them. The radios have served me well over the years. They operate on both FRS and GMRS (General Mobile Radio Service) frequencies. Certain frequencies (the radios have specific channels that correspond to frequencies) are FRS only, some are GMRS only, and some are available on both. FRS is unlicensed whereas GMRS requires a license. Since I don’t have a GMRS license, I’ve always stuck to the FRS channels. I suspect that many people just picked a channel and used it, not realizing that they were violating FCC regulations.

In 2017 the FCC adopted changes to FRS and GMRS which increased the maximum output for FRS radios (from 0.5 W to 2.0 W) and made available certain channels that were GMRS only to FRS users. They basically acknowledged that people were using the FRS/GMRS radios with no regard to which channels were being used.

The rechargeable batteries on the Motorola T5000 radios I owned had long died and with my push to have all my devices rechargeable via USB, I decided this past spring to replace the radios. The radios also had a maximum output of 0.48 W on the GMRS channels (now available for FRS) which meant that the range of the radios was quite limited.

I liked the styling of the newer Motorola Talkabout radios, so I picked up a pair of Talkabout T260 radios. I used them on a Scout trip and they worked quite well, but there was one spot where I lost contact with another leader who was at the front of the hiking group. We also used the radios on a cruise ship and they worked pretty well across several decks, even with all the steel that blocks signals. Even though the radios performed decently, I decided to do some more research on them. The radios are advertised with a 22 mile range; that, of course, is in ideal conditions and never happens. The manual has a chart of channels and shows the maximum output power on the different frequencies. This is quite misleading, as I soon discovered. All FCC registered devices have information available on the FCC’s website, including test reports showing the actual power output. Doing a search for the FCC ID AZ489FT4929, I discovered that the maximum output of the radio was 0.8 W, which is far below the legal maximum.

After more research, I found the older manual for the radio and it appears that Motorola changed a few pieces when the new FCC regulations took effect including the chart; the old chart showed which channels were FRS and which were GMRS. The new chart was basically a waste because all the channels could be used by all users; some of the channels would have higher output. Motorola did respond to my claim that the manual was misleading by saying that the chart in the manual wasn’t there to indicate how much power the radios actually put out, but to show that it complied with the FCC maximum.

This newfound knowledge kind of bummed me out, so I went back to the drawing board, went through all the Motorola Talkabout radios (I like the design of these radios over other brands), and looked up the FCC ID of each one, looking for the radios with the highest output power (up to the legal limit). If my information is correct, the Motorola Talkabout T460, which has an FCC ID of AZ489FT4924, has a maximum output power twice that of the T260 (1.7 W). While I don’t expect to ever get the 35 mile range advertised with these radios, I at least could have radios with the highest power available.

I purchased a pair of the T460 radios and have now used the radios on a couple of trips and they are far superior to the T260 (and the old T5000 ones I had before). (While I did purchase these radios, Motorola generously sent me another pair of the T460 radios in response to my letter to them about the T260 manual). Specifically the T460 has an analog control for the volume making it easier to turn the units on/off and control the volume. They also have a feature called VibraCall which vibrates the radio the first time it receives a transmission if it hasn’t received a transmission in a certain period of time; this is handy if the volume is turned down or you happen to not be paying attention. Once the radio vibrates, you can ask the other end to repeat the message. The radios also have a weather radio, different call tones, and a few other features that are generally not of much use to me.

Pros

  • Highest power output of Motorola Talkabout radios
  • Water/splash resistant (IP54)
  • VibraCall
  • Analog volume control
  • Acceptable voice quality
  • Standby time allows for all day usage in my testing

Cons

  • Doesn’t come anywhere close to the advertised range

Summary

If you’re looking for relatively low-cost, unlicensed FRS radios, I think the Motorola Talkabout T460 are hard to beat. They have a decent range and a number of features that make them easy to use. Time, of course, will tell how durable they are and how good the range is in a variety of environments.

All In with Home Assistant

I’ve spent parts of the last 9 months playing with Home Assistant and have written about some of my adventures. A few weeks ago, I finally decided to go all in with Home Assistant and ditch my Vera. I bought an Aeotec Z-Stick Gen5 Z-Wave dongle and started moving all my devices over to it. Within a few days, I had all my devices moved over and unplugged my Vera. Everything was running great on my Raspberry Pi B, but I noticed that the History and Logbook features were slow. I like looking at the history to see temperature fluctuations in the house.

History graph

I had read that switching from the SQLite database to a MySQL database would speed things up. So I installed MariaDB (a fork of MySQL) on my Raspberry Pi and saw a slight increase in speed, but not much. Next was to move MariaDB to a separate server using Docker. Again, a slight increase in speed, but it still lagged. At this point everything I read pointed to running Home Assistant on an Intel NUC or another computer. I didn’t want to invest that kind of money in this, so I took a look at what I had and started down the path of installing Ubuntu on my old Mac mini which was completely overkill for it (Intel Quad Core i7, 16 GB RAM, 1 TB SSD). Then I remembered that I had read about a virtual machine image for Hass.io and decided to give that a try.
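Incidentally, pointing the Home Assistant recorder at MariaDB instead of SQLite is a single entry in configuration.yaml; this is a sketch, and the user, password, host, and database name below are placeholders for whatever your MariaDB instance uses:

    # configuration.yaml -- use MariaDB/MySQL for the recorder instead of SQLite
    recorder:
      db_url: mysql://hassuser:PASSWORD@DB_HOST/homeassistant?charset=utf8mb4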

After some experimenting, I managed to get Home Assistant installed in a virtual machine running in VMware on my Mac Pro. (A few days after I did this, I saw that someone posted an article documenting this.) I gave the VM 8 GB of RAM, 2 cores (the Mac Pro has 12), and 50 GB of storage. Wow, the speed improvement was significant and history now shows up almost instantly (the database is running in a separate VM)! I was so pleased with this, I decided to unplug the Raspberry Pi and make the virtual machine my home automation hub. There were a few tricks, however. The virtual machine’s main disk had to be set up as a SATA drive (the default SCSI wouldn’t boot), suspending the VM confused it, and the Z-Wave stick wouldn’t reconnect upon restart. After much digging, I found the changes I needed to make to the .vmx file of the virtual machine:

    suspend.disabled = "TRUE"
    usb.autoConnect.device0 = "name:Sigma\ Designs\ Modem"

(The USB auto connect is documented deep down on VMWare’s site.)

I’ve rebooted the Mac Pro a few times and everything comes up without a problem very quickly, so I’m now good to go with this setup. Z-Wave takes about 2.5 minutes to finish startup vs. 5 or 6 on the Pi. A friend asked if I was OK with running a “mission critical” component on a VM. I said that I was because the Mac Pro has been rock solid for a long time and my virtual machines have been performing well. I could change my mind later on, but I see no reason to spin up another machine when I have a perfectly overpowered machine that is idle 95% of the time.

What next? Now that I have more power for my automation, I may look at more pretty graphs and statistics. I may also just cool it for a while, as I’ve poured a lot of time into this lately to get things working to my satisfaction. This has definitely been an adventure and I’m glad that I embarked on it.

Dipping my toe in the world of Docker

A former co-worker of mine has talked about Docker for years and I’ve taken a look at it a few times, but have generally been uninterested in it. Recently with my interest in Home Assistant, I’ve decided to take another look as many of the installs of Home Assistant as well as Hass.io are based on Docker.

I’ve used virtual machines running on VMware Fusion for years with some Windows installs and some Linux installs. I’m very comfortable with Linux, but kind of dislike maintaining different packages. There are package managers that handle much of it for me, but then there are other packages that have special installations.

I had a few goals in mind for seeing if Docker could replace the current virtual machines I had running for Pi-hole and Observium. The goals were pretty simple: I wanted easy updates and the ability to easily back up the data. In the Docker world, updates are dead simple, and in many Docker containers the data is stored outside of the container, making it easy to back up. As another goal, I wanted to be able to experiment with other containers to see what else I could add to my network.
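To make the easy-update claim concrete, refreshing an entire compose-managed stack comes down to a couple of commands run from the directory containing docker-compose.yml (a sketch of the usual workflow):

    docker-compose pull            # fetch newer images for every service
    docker-compose up -d           # recreate only the containers whose images changed
    docker image prune -f          # optional: clean up old, now-unused images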

With all this in mind, I started looking at how to set up Docker. Pretty quickly, I realized that Docker for Mac was virtually useless for me as it didn’t handle all the networking that Docker running on Linux could. So that meant installing Docker on a Linux VM, which almost negated my goal of easy updates as I’d still have to update the virtual machine running Ubuntu. I could live with that if the rest of the setup was straightforward and I didn’t have to remember how to update each container individually.
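Getting Docker onto the Ubuntu VM is just a couple of packages; this sketch uses the distribution’s own packages (Docker’s upstream repository is the other common route):

    sudo apt update
    sudo apt install docker.io docker-compose
    sudo usermod -aG docker $USER   # run docker without sudo (log out and back in afterwards)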

In order to make backups easy, I wanted to store the data on my Mac and not inside of the virtual machine. I’ve not had great luck with the VMware tools for mounting volumes, so I decided to use CIFS (SMB) to mount a volume in Linux, which works well except for the MariaDB (MySQL fork) Docker container. Not a big deal; I’d just add a cron job to dump the databases every few hours and store the dumps on the mounted volume. I added the following to /etc/fstab:

    //myserver/account/Documents/Ubuntu /mnt/mediacenter cifs username=account,domain=WORKGROUP,password=password,rw,hard,uid=1000,gid=1000 0 0

I also had to turn on Windows File Sharing options on the Mac.

Windows File Sharing

The crontab is:

    30 */2 * * * /usr/local/bin/backup_mysql

with the backup_mysql script being:

    #!/bin/sh
    # Dump all databases from MariaDB and compress them into a timestamped file
    /usr/bin/mysqldump -h 127.0.0.1 -u root -ppassword --lock-all-tables --all-databases | gzip > /mnt/mediacenter/backups/mysql/mysql_backup_$(date +"%m-%d-%Y-%H_%M_%S").gz
    # Delete compressed dumps older than 3 days
    find /mnt/mediacenter/backups/mysql/* -mtime +3 -exec rm {} \;
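Restoring from one of these dumps is roughly the reverse of the backup (a sketch, assuming the same host and credentials as the script above; the filename is a placeholder for whichever dump you want to restore):

    gunzip < /mnt/mediacenter/backups/mysql/mysql_backup_DATE.gz | /usr/bin/mysql -h 127.0.0.1 -u root -ppassword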

The next hurdle was dealing with IPv6; most people don’t care about it, but I’m not most people! IPv6 is quite complicated (at least to me), so that took a bit of experimenting to get it to work in Docker. For future reference, ndppd lets the virtual machine tell the world that it handles IPv6 for the Docker containers (basically).
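The ndppd configuration itself is only a few lines. This is a minimal sketch, assuming the VM’s uplink interface is ens33 and using the same (redacted) container prefix as the compose file below:

    # /etc/ndppd.conf -- respond to IPv6 neighbor solicitations for the Docker
    # container prefix on behalf of the containers behind this VM
    proxy ens33 {
        rule XXXX:XXXX:XXXX:XXXX:1::/120 {
            static
        }
    }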

So where was I? After getting the Linux VM set up, it was on to setting up my containers. With docker-compose, I could put the configuration for all my containers in one file. This was great, as I could modify it and test out different containers. After a few days of work, this is the core of my docker-compose file. There are a few other containers I’ve added, including LibreNMS, but this is basically what I have. The nginx-proxy is great: I just add DNS entries for each service and it handles SSL and lets me run multiple web services on the same machine.

version: "2.3"
services:
  nginx-proxy:
   image: jwilder/nginx-proxy
   environment:
      - DEFAULT_HOST=pihole.exmple.com
   ports:
     - "80:80"
     - "443:443"
     - "::1:8080:80"
   dns:
     - 10.0.1.1
   volumes:
     - /var/run/docker.sock:/tmp/docker.sock:ro
     - '/mnt/mediacenter/docker/certs:/etc/nginx/certs'
   restart: always
   networks:
      default:
        ipv6_address: XXXX:XXXX:XXXX:XXXX:1::2

  pihole:
    image: pihole/pihole:latest
    ports:
      - "53:53/tcp"
      - "53:53/udp"
    environment:
      # enter your docker host IP here
      ServerIP: 10.0.1.200
      WEBPASSWORD: ''
      DNS1: 127.0.0.1
      DNS2: 10.0.1.1
      DNS3: XXXX:XXXX:XXXX:XXXX:XXXX:XXXX:XXXX:XXXX
      # IPv6 Address if your network supports it
      ServerIPv6: XXXX:XXXX:XXXX:XXXX:1::3
      VIRTUAL_HOST: pihole.example.com
    volumes:
      - '/mnt/mediacenter/docker/pihole/pihole/:/etc/pihole/'
      - '/mnt/mediacenter/docker/pihole/dnsmasq.d/:/etc/dnsmasq.d/'
      - '/mnt/mediacenter/docker/pihole/pihole.log:/var/log/pihole.log'
      # WARNING: if this log don't exist as a file on the host already
      # docker will try to create a directory in it's place making for lots of errors
      # - '/var/log/pihole.log:/var/log/pihole.log'
    restart: always
    cap_add:
        - NET_ADMIN
    networks:
      default:
        ipv6_address: XXXX:XXXX:XXXX:XXXX:1::3

  mariadb:
     image: mariadb
     ports:
       - 3306:3306
     volumes:
       - '/mariadb/data/:/var/lib/mysql/'
     environment:
       MYSQL_ROOT_PASSWORD: password
     restart: always
     user: "1000"
     networks:
       default:
        ipv6_address: XXXX:XXXX:XXXX:XXXX:1::4

networks:
  default:
      driver: bridge
      enable_ipv6: true
      ipam:            
        driver: default            
        config:                
            - subnet: 192.168.0.0/24                
            - subnet: "XXXX:XXXX:XXXX:XXXX:1::/120"                

Phew, that was a lot of work to get things running. However, I’m pretty pleased with the result. I now have the ability to experiment with other containers and can restore my data easily if things go awry. Is Docker the answer to everything? Probably not, but it appears to handle this job well.