-
Auto Layout in a UITableViewCell with an image
Auto Layout is an amazing concept for developing iOS apps, as it makes it much easier for an application to look good across different devices. It also makes supporting Dynamic Type far less painful; as someone who wears glasses, I always increase the type size on my devices, and I get annoyed when apps don't respect it. So when I develop apps, I try to support Dynamic Type, and Auto Layout makes that much easier.
A common pattern in apps I develop is a UITableView with a bunch of rows. The rows contain different bits of text, and the only sensible way to lay them out is with Auto Layout. With the latest releases of iOS, a lot of the code for dealing with Dynamic Type, cell heights, responding to device changes, etc. has been eliminated. However, there are still a few gotchas in making an app with a UITableView that behaves properly.
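Most of that eliminated code now comes down to enabling self-sizing rows on the table view. As a point of reference, here is a minimal sketch of the setup I assume for the rest of this post (the class name and reuse identifier are placeholders, not code from my repo):

import UIKit

class ExampleTableViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Register the cell's xib (created below) and let Auto Layout size the rows.
        tableView.register(UINib(nibName: "ExampleTableViewCell", bundle: nil),
                           forCellReuseIdentifier: "ExampleTableViewCell")
        tableView.rowHeight = UITableView.automaticDimension
        tableView.estimatedRowHeight = 100
    }
}

With rowHeight set to UITableView.automaticDimension, each row's height comes from the cell's Auto Layout constraints, which is what the rest of this approach relies on.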
I'm going to go over the steps I used (and provide sample code) for how I handle this. I had a few requirements that make my implementation a little different than other tutorials on the web.
- Each cell has an image that would be at most 1/3 the width of the screen.
- The image must touch the top and bottom of the row.
- The image must be at least a certain height.
- The image should attempt to have the aspect ratio of the original image. (Aspect Fit could leave white space; Aspect Fill could hide some of the image.)
- Images are loaded asynchronously from the Internet.
- Next to the image is up to 5 lines of text pinned to the top.
- Next to the image is 1 line of text pinned to the bottom.
- Each line of text could wrap to multiple lines.
- Increasing the type size must resize the rows.
- The cell must have a minimum height.
- Rotating the device must work.
Writing that out sure looks more complicated than it was in my head!
I'm not going to go over the initial project setup, but will jump right into Interface Builder after creating a UITableViewCell with a xib. Note that I don't use Storyboards and opt to create a separate xib for each view controller and each cell. Storyboards tend to bite me every time I use them: I like to re-use code as much as possible, and by default the Storyboard for a UITableViewController puts the cell in the Storyboard, making it harder to manage. I'm sure some would argue that Storyboards are great, but they just don't work for me.
- Create a new UITableViewCell with a xib.
- In the xib, add a UIImageView to the left side that is pinned to the left, top, and bottom of the cell.
- Set the UIImageView to Aspect Fill and Clip to bounds.
- Give the UIImageView fixed width and height constraints. (We'll change this later.)
- Add a vertical stack view pinned 10 points from the UIImageView, 10 points from the top, and 10 points from the right.
- Add 5 UILabels to the stack view. Each label should have "Automatically Adjusts Font" checked and a Font of Body. Set 1 or more of the labels to 0 Lines so that they grow vertically. Also set Autoshrink to a minimum font scale of 0.5.
- Add another vertical stack view pinned to the leading edge of the first stack view, 10 points from the right and bottom of the container, and with a top constraint of >= 5 to the first stack view.
- Add a UILabel to this stack view. Set the font as above.
- Set the height constraint of the UIImageView to a priority of 250 (low).
- Add a UIActivityIndicatorView that is centered on the UIImageView (set the constraints).
- Create a UIImageView subclass that looks like the one below. A UIImageView uses the intrinsic size of its image in the absence of other constraints, and we want more control over the height of the view.
class ExampleImageView: UIImageView {
    var minimumHeight: CGFloat = 0

    override var intrinsicContentSize: CGSize {
        if minimumHeight == 0 {
            return super.intrinsicContentSize
        }
        return CGSize(width: 0, height: minimumHeight)
    }
}
- Change the UIImageView class to ExampleImageView.
- Connect outlets for the 6 UILabels, the UIImageView (with the new class), the activity indicator and the height and width constraints on the UIImageView.
Your xib should look like this:
Time to move into the source of the table view cell. I'm only going to cover the interesting parts here; see my example repo for the full code.
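The outlets from the last step of the xib setup end up in the cell class looking something like this (the names are placeholders I'm using for this post, not necessarily what's in the repo):

import UIKit

class ExampleTableViewCell: UITableViewCell {
    // The image view uses the subclass above so we can control its intrinsic height.
    @IBOutlet weak var exampleImageView: ExampleImageView!
    @IBOutlet weak var activityIndicator: UIActivityIndicatorView!

    // 5 labels in the top stack view and 1 pinned to the bottom.
    @IBOutlet weak var titleLabel: UILabel!
    @IBOutlet weak var detailLabel1: UILabel!
    @IBOutlet weak var detailLabel2: UILabel!
    @IBOutlet weak var detailLabel3: UILabel!
    @IBOutlet weak var detailLabel4: UILabel!
    @IBOutlet weak var bottomLabel: UILabel!

    // Width and height constraints on the image view; adjusted in code.
    @IBOutlet weak var imageWidthConstraint: NSLayoutConstraint!
    @IBOutlet weak var imageHeightConstraint: NSLayoutConstraint!
}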
- Set up a variable for the cell width. This will be set by the view controller so that rotation changes can change the width of the image.
var cellWidth: CGFloat = 0 {
    didSet {
        maxImageWidth = cellWidth / 3
        setImageWidth()
    }
}
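The view controller is what feeds cellWidth in. Here is a sketch of what that might look like in the same UITableViewController subclass sketched earlier, using the placeholder names from above (the real code is in the example repo):

override func tableView(_ tableView: UITableView,
                        cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "ExampleTableViewCell",
                                             for: indexPath) as! ExampleTableViewCell
    // Give the cell the current width so it can cap the image at 1/3 of it.
    cell.cellWidth = tableView.bounds.width
    // Configure the labels and kick off the asynchronous image load here.
    return cell
}

override func viewWillTransition(to size: CGSize,
                                 with coordinator: UIViewControllerTransitionCoordinator) {
    super.viewWillTransition(to: size, with: coordinator)
    // On rotation the width changes, so the visible cells need the new value.
    tableView.reloadData()
}

The idea, following the requirements above, is that once the asynchronously loaded image arrives, the cell stops the activity indicator and setImageWidth() updates the width and height constraints from the image's aspect ratio, bounded by maxImageWidth and the minimum height.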
-
Home Assistant and Node-RED, automation perfected?
Earlier this year I started experimenting with Home Assistant and wrote some thoughts about it. It brought together enough pieces that I keep tinkering with it. A friend of mine had mentioned a project called Node-RED which is supposed to more easily link together IoT components and perform actions based on different inputs. At the time, I brushed it off as I didn't want to bother figuring out how to install it.
Fast forward to last week when I noticed that Home Assistant had a Node-RED add-on. Installing the add-on was quite easy and I was presented with a blank canvas. After a few minutes, I figured out how to make a simple sequence that took the state of a door sensor (at the time connected via my Vera) and turned on a light. Nothing too fancy, but I was able to hook it up, hit Deploy, and test. It worked! This was light years ahead of the YAML-based automations in Home Assistant and much faster to set up than on the Vera. I was hooked almost immediately. Could I convert all my automation logic to this? I spent the next few days trying.
Wiring some basic automations was quite easy as in this example that turns off lights in my bathroom if there is no motion. Node-RED resets the timer if there is more motion, making the sequence very straightforward.
Unfortunately not all my automations are this simple.
The most important sequences deal with my front and back motion sensors. I could just use something like the above sequence, but I only want the lights on at night, I want to be able to disable the motion sensors (on Halloween, for example, if we're not home I don't want lights coming on), and if I turn the lights on manually I don't want them turning off a few minutes after there is motion. The tricky part was determining whether the lights were turned on manually or triggered by a motion sensor. After a bit of experimenting, I decided to record the last time there was motion in a flow variable. Then, when the lights turn on, I check the variable to see how long ago the motion occurred. If it was less than 30 seconds ago, I treat the lights as having been triggered by motion. With that, I was able to set an off timer based on how the lights came on. The one bit of "code" I had to write was to determine how long ago the motion was tripped.
// lastFrontMotion is set elsewhere in the flow whenever the front motion sensor trips.
var motionTimestamp = flow.get('lastFrontMotion') || -1;
var difference = (Date.now() - motionTimestamp) / 1000;
if (motionTimestamp === -1) {
    difference = -1;
}
// Pass the number of seconds since the last motion (or -1 if none) to the next node.
msg.payload = difference;
return msg;
The flow may look complicated, but to me it is quite readable.
The visual aspect of Node-RED makes it easier to set up automations, but as I've shown above, calling home automation simple would be far from the truth. As a professional software developer, writing code doesn't scare me, but for a hobby this visual approach (with a little code as necessary) is much nicer. When I come back to this in 6 months, I have no doubt that I'll be able to read what is going on and troubleshoot as necessary.
For the last 5 years, I've been using my Vera to control everything with a plugin called PLEG, which stands for Program Logic Event Generator. It has worked quite well, but it has been so long since I set up most of the automations that I have no idea what I did. PLEG, while functional, was a bit difficult for me to wrap my head around, and it pained me every time I had to touch it. Also, when I did touch it, I seem to recall having to restart the Vera and wait, only to find out that I needed to change something.
I'm so impressed with Node-RED that I've decided to see if I can move all my Z-Wave devices to Home Assistant using an Aeotec Z-Stick Gen5 plugged into the machine running Home Assistant. The goal of this move is to speed up messaging; right now Home Assistant polls (I think) the Vera all the time looking for changes. This isn't very efficient, and pressing a button can take a second or two before the message reaches Home Assistant. Will this work? I hope so!
I know that I'm just scratching the surface with this, but I am very excited over the prospects!
-
Non-secure network connections in Carnival Cruise's app
This past summer my family took a cruise on Carnival Cruise Lines to the Eastern Caribbean. There were a total of 17 of us and we had a good time. One of the suggested ways for everyone to stay in touch was the Carnival Hub app, which is basically their go-to app for up-to-date information on the ship and also has a messaging component. At $5 per device for the cruise, it didn't seem all that unreasonable, except that just about everything on the cruise costs extra!
The chat app, like most chat apps, has push notifications. In iOS there are 2 types of push notifications, local and remote. Remote notifications require a persistent connection to Apple's Push Notification service (APNS). I suspected that the app used local notifications and stayed open in the background, as having several thousand devices connected to Apple's or Google's push servers over a satellite link would not make much sense. So I pulled out my trusty copy of Charles Proxy and decided to see what traffic was being sent. What I saw just about shocked me.
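For context, a local notification is scheduled entirely on the device; nothing has to go out through APNS. A rough Swift illustration of the mechanism (my own sketch, obviously not the app's actual code):

import UserNotifications

// Deliver an on-device notification for an incoming chat message.
// A nil trigger means it fires immediately; no push service is involved.
func showChatNotification(from sender: String, message: String) {
    let content = UNMutableNotificationContent()
    content.title = sender
    content.body = message

    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: nil)
    UNUserNotificationCenter.current().add(request)
}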
Connections made by the app were NOT using SSL! Since the WiFi was unprotected (it would be cumbersome to give out a WiFi password to so many users), anyone with rudimentary hardware and software could sniff all the traffic. SSL certificates are cheap and easy to deploy, so there is no excuse for any service not to use them (I use them internally on all services running at my house).
Is it really that bad that the app isn't using SSL, given that no credit card data flows through it? Absolutely! People could be chatting about which rooms they are in and when they plan to meet, giving criminals information about when to go into their rooms. People could also tell their friends and family what they have in their rooms, making them targets for criminals ("I put the laptop/camera under the bed", for example). Not only was chat not protected by SSL, every other aspect of the app's communication was also sent in clear text.
Example requests and responses
This request has my folio number and name; those 2 pieces of information could allow anyone to charge to my room. While the staff should look at the ship ID (you are given what is basically a name badge that serves as your room key and is used for purchases), I don't know if they always did. My cabin number was also in the request.
GET /FHMA-leviathan/api/Guest?isKiosk=false HTTP/1.1
-
HDMI ARC and HDMI CEC
Several years ago, I purchased a Vizio 5.1 soundbar system. At the time, the way to get the best audio from it was to use the optical input. This worked fine, but required me to use 3 remotes for watching TV; 1 for the TV, 1 for the soundbar, and 1 more for the Roku I had at the time. When the Apple TV 4 came out, I learned about HDMI CEC which is basically a protocol that lets devices talk to each other and have some control. The Apple TV remote then let me turn on the TV and put it in standby without touching the TV remote. That brought me down to 2 remotes. The Apple TV remote could also control the soundbar using IR which brought me down to 1 remote.
This setup worked fine for years, but had a few slight problems. The first was that when I powered on the Apple TV and TV using the remote, I'd have to hit the volume up button a few times to wake the soundbar and then lower the volume again. The second was that putting the Apple TV and TV in standby did nothing for the soundbar (it did go into low power mode after a while). The last complaint, albeit minor, was that I couldn't use my iPhone or iPad to control the volume.
I'd read about HDMI Audio Return Channel (ARC), where an HDMI cable is used instead of optical audio out, which can deliver better audio. My soundbar didn't have this option (the TV, which was older than the soundbar, did), so I was stuck with optical audio. In addition, if the devices support HDMI CEC, the volume can be controlled using another device's remote.
A few weeks ago, I finally decided to upgrade my soundbar to one that supports Dolby Atmos and purchased the Vizio SB36512-F6, which was on sale at Costco. While I have no idea if I'll be able to hear the Dolby Atmos difference (I need content that supports it), I'm pretty pleased with the purchase. This soundbar is connected via HDMI and allows me to use the Apple TV remote (and my iPhone/iPad) to completely control my entertainment devices. In addition, the sound from the bar seems crisper and I can now hear the rear speakers much better; it may be that HDMI ARC works better than optical, or maybe it just makes things easier to configure. I am excited to try out Atmos and see if it lives up to the hype in the room where I watch TV (it may not, as the ceiling isn't that high and, due to the layout, it is just part of a larger room).
I love when devices work together and with this new soundbar, I may have found the perfect combination for my viewing experience.
One last thing: the iPhone app for the soundbar is a piece of garbage. I used it to upgrade the firmware on the soundbar and promptly deleted it. Why is it so hard to make a basic app for controlling the settings of a device?