We did manage to go kayaking once on Lake Superior

Six months ago, I accepted a new position in Duluth, MN and moved to a house we already owned in Wisconsin, not too far from Duluth. And by not too far, I mean my commute is about an hour, depending on the weather. It’s about the same drive time I had in Denver, except there’s no traffic. There are some other changes we have had to endure as well, some good, some not so good.


As I already mentioned, the drive is about the same in time but about twice as long in distance. My biggest challenge in my commute is watching for wildlife. So far I’ve lost count of the number of deer I have seen in or beside the road, and I have only had one close call. I saw a coyote carrying dinner across the road a few nights ago. I’ve seen at least a dozen bald eagles scavenging alongside the road. There have been some turkeys, and a black bear as well. You also have to be careful of the ATVs and snowmobiles in the afternoon.


The house is much smaller than the house we rented in Denver, and much, much smaller than the house we sold in Tulsa. On the upside, the nearest neighbor is 200 yards away. I have a large field that the deer use, and forest all around me. The winter nights are usually very quiet and still, while the nights in the spring have been a chorus of birds, frogs, and insects. The biggest improvement on the housing front is the gigabit internet speed available via fiber, ten times as fast as Denver. The house does need lots of work though, and there is no garage.

Not everything has gone by the numbers though

Shortly after Christmas, we learned that my wife’s grandfather had been taken to the hospital by ambulance, which at his age isn’t a shock in and of itself. The change-up came when the county moved to have him placed in a nursing home and appointed my wife as his guardian. We then learned that what we thought was general forgetfulness was actually advanced dementia, which has been a challenge to say the least.

One of the responsibilities that accompanies guardianship is protecting the assets. Oh wow, what an interesting development that has been. Let’s just say some people’s treasures are most people’s junk. For example, four or five years ago we got rid of some crappy Walmart wardrobe closets that were in our house. Her grandfather had come by and asked if he could have them. Sure, if you can use them or know someone who can, by all means. Well, we discovered those same wardrobes in the corner of the garage, water-damaged beyond repair. Apparently the dementia had crept in a while back. So there’s that.

But that wasn’t the only fun. The winter was long, very cold, and snowy. Let’s just say my snow blower got a workout.

Still though

With all of the challenges, I’m still glad we moved. The stress is much less than in Denver, and I feel like we can focus on less stressful living. It’s easier to make fitness a habit; I’m usually hitting the gym after work. I haven’t had as much time to work on my programming skills as I would like, but I imagine things will begin to ease up and allow more time. We’re hoping to get out for some kayaking and fishing soon as well.

This week I was asked to help a developer figure out why failover wasn’t working on a Postgres cluster. Interesting enough, I guess. Especially because I don’t know anything about Postgres. Good time to learn, I guess? I don’t know. Anyway, I accepted the directive and started trying to get familiar with the new RDBMS. The dev sent me a username and password to work with, so I got to work trying to figure out the issue.

The Work

Starting off, there was a failure in communication. I was sent a username and password, and naturally assumed this was a Postgres user. So I loaded up pgAdmin and tried to go to work. Yeah, no, that wasn’t happening. It turns out the username and password given to me were for a Linux account. A Linux account without the ability to sudo to root (or anyone else). So after going in circles for the morning, I finally got my access straightened out.

Moving on, I set up a user for myself in Postgres and modified the necessary conf files to let me log into Postgres with pgAdmin from my laptop. Sadly, between the Googling and the continued back and forth over my access, this took the better part of the day. Once in though, it didn’t take me too long to figure out the basic lay of the land. I guess when you get down to it, an RDBMS is an RDBMS is an RDBMS.
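For anyone walking the same path, the setup boiled down to creating a login role and then opening up the auth and listener settings; here is a rough sketch, where the role name, password, and network range are all made-up placeholders:

```sql
-- Run as the postgres superuser; role name and password are placeholders
CREATE ROLE dba_admin LOGIN PASSWORD 'change-me' SUPERUSER;
```

```
# pg_hba.conf: allow password logins for that role from my laptop's subnet
host    all    dba_admin    192.168.1.0/24    md5

# postgresql.conf: listen on more than just localhost
listen_addresses = '*'
```

After editing the conf files, the server needs a reload (for example, `SELECT pg_reload_conf();` from psql) before pgAdmin can connect from another machine.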


This is where the process takes a turn. In SQL Server, when you think of failover replication with Availability Groups, you know you can bounce the nodes pretty easily. You can make any member node of the AG the primary replica without much work. Makes sense. Not so in Postgres.
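On the SQL Server side, that ease really is a one-liner; a sketch, where the server and AG names are placeholders:

```shell
# Run against the secondary replica you want to become the new primary
# (server and availability group names are made up)
sqlcmd -S SecondaryNode -Q "ALTER AVAILABILITY GROUP [MyAG] FAILOVER;"
```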

In Postgres native replication, to fail over you have to shut the primary node down and then manually (or through a trigger file… still not sure about this) promote the standby server to master. Then, when you bring the previous master back online, it has to be brought up and configured as a standby server. At least, that is my understanding of the process so far. It seems a little clunky to me to have to jump through these kinds of hoops to have failover configured. Then again, there is probably some framework or script that handles failovers better. I’m still working on understanding if this is actually how Postgres handles failover.
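As far as I can tell so far, the manual dance looks roughly like this; a sketch only, with a made-up data directory, assuming streaming replication with a `recovery.conf` on the standby:

```shell
# On the old primary: make sure it is actually stopped
pg_ctl -D /var/lib/pgsql/data stop -m fast

# On the standby: promote it to be the new primary...
pg_ctl -D /var/lib/pgsql/data promote
# ...or, if recovery.conf names a trigger_file, just create that file:
# touch /tmp/postgresql.trigger

# The old primary then has to be rebuilt as a standby before rejoining,
# e.g. by re-seeding its data directory from the new primary
# with pg_basebackup and giving it its own recovery.conf
```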

Still learning….

I’ve been meaning to get my mind wrapped around source control for a while now. I was all hyped about it after a recent SQL Saturday, and then I wanted to make it a learning goal for the year. Looking back at those posts, I guess the goal was to do more “DevOps” type stuff, which includes source control, so there ya go.

Past Failures

This wasn’t the first time I tried to work with source control. When I was working on my undergrad long ago, I wanted to get more experience as a developer and tried to work on an open source project with a friend. Unfortunately, I was remote to the project, and getting my mind wrapped around how to implement source control frustrated me to the point of giving up.

Then I spent some time trying to figure out how to set up a local repository that I could use. Here again, it became frustrating, as SSMS doesn’t have a built-in method for utilizing source control. And using Visual Studio is a complex, confusing endeavor for the everyday DBA. In the end, I just stuck all of my scripts into an organized folder in My Documents and called it good.

This Time

I have taken the approach of checking all my scripts into a local code repository, then pushing that code to Bitbucket. Why Bitbucket? Well, because it’s what the organization I work for uses, so I’m trying to kill two birds: learn how to effectively use source control while I also figure out how to navigate Bitbucket. I’m still on the fence about using the rest of the tools from Atlassian. To push the code I have been using GitKraken, which works great on both Windows and Mac.
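Under the hood, GitKraken is doing the same check-in-and-push loop you would do at the command line. Here’s a minimal sketch of that loop; a local bare repository stands in for the Bitbucket remote, and the repo name, file names, and identity are all placeholders:

```shell
set -e
workdir=$(mktemp -d)

# A local bare repo stands in for the Bitbucket remote in this sketch;
# with a real remote you would use the clone URL from Bitbucket instead.
git init --bare "$workdir/remote.git"

# Create the working repository and check in a script
mkdir "$workdir/sql-scripts"
cd "$workdir/sql-scripts"
git init
git config user.email "dba@example.com"   # placeholder identity
git config user.name "Example DBA"

echo "SELECT @@VERSION;" > server-version.sql
git add server-version.sql
git commit -m "Initial check-in of DBA scripts"

# Wire up the remote and push the current branch
git remote add origin "$workdir/remote.git"
git push -u origin HEAD
```

With a real Bitbucket repo, only the `git remote add` URL changes; the add/commit/push rhythm stays the same.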

Another avenue for getting used to source control I have been utilizing is Visual Studio. Again, two-birds mentality here: I get familiar with both source control and writing code in Visual Studio. Is that important for a DBA? I don’t know; there are some advantages to using Visual Studio for development and leaving SSMS for the administration work. Of course, at this point I am still too much of a novice to notice much difference.

Just have to keep plugging away at it.

Much more to learn

There’s still much more stuff I need to figure out, like how to branch, merge, correct mistakes, and recover from those oops moments. Such as when I accidentally deleted my repo while setting this up. Thankfully, I managed to recover most of my scripts. It keeps things interesting, right?
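The pieces I’m practicing look something like this; a throwaway-repository sketch where the branch and file names are made up:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init
git config user.email "dba@example.com"   # placeholder identity
git config user.name "Example DBA"

echo "SELECT 1;" > sanity.sql
git add sanity.sql
git commit -m "baseline"

# Branch, make a change, and merge it back
git checkout -b index-maintenance
echo "-- TODO: index rebuild script" >> sanity.sql
git commit -am "note index work"
git checkout -              # back to the original branch
git merge index-maintenance

# Recover from an 'oops': restore an accidentally deleted tracked file
rm sanity.sql
git checkout -- sanity.sql
```

Recovering a whole deleted repository is a different story, of course; this only covers mistakes inside a repo that still exists.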