Illness Recovery

This weekend was a bit of a challenge. Rarely do I become ill, and even more rarely do I have to spend PTO due to illness. What began as a sore throat Wednesday turned into a sinus infection by the end of the week. Yesterday was my first day 100% fever-free since Friday. I’ll spare you the gory details, but I’m on the mend, and have felt better and better the last two days.

My Zimbra install is humming along. I’ll do a separate post about it sometime, but I’m pleased with the result. Eddie Jennings Services, LLC will be recording the District 13 Middle School Honor Band event again in Toccoa. This is always fun, since it gives me a chance to reconnect with my band director colleagues.

I’m sure I’ll write a more substantive post later this week, but I wanted to let everyone know I’m still around.

Cert Adventures and Zimbra

Tomorrow I will return to RHCSA training. I’ve been too busy with my lab to get back into my modules. Speaking of the lab, the colo lab is starting to be transformed into production for the LLC, which is kind of exciting.

Almost all of this weekend was devoted to reading about and deploying Zimbra Open Source. I did use some guides on MangoLassi, but really the installer script does the heavy lifting. The greatest frustration was getting an IMAP client to connect — of course this was instantly fixed once I remembered to open the necessary port on the VM’s actual firewall. >( I always hate when I’m the source of wasted troubleshooting time.
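For anyone who hits the same wall: the fix amounted to opening the IMAP ports on the VM itself. A sketch, assuming a firewalld-based distro (adjust the ports to whatever your client actually uses):

```shell
# Open IMAP (143) and IMAPS (993) on the VM's own firewall (assumes firewalld):
sudo firewall-cmd --permanent --add-port=143/tcp
sudo firewall-cmd --permanent --add-port=993/tcp
sudo firewall-cmd --reload

# Sanity check that the ports are actually open:
sudo firewall-cmd --list-ports
```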

Deploying an E-mail server gave me a chance to re-familiarize myself with SPF. Even though it’s easy to use, I had to reference some syntax to make sure I did it right. I also intend to use DKIM, though I don’t have a deep understanding of how it works. I’ve followed guides before, but never took the time to learn what’s actually going on.
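For my own future reference, here’s roughly what the two DNS records look like. This is a sketch with placeholder values (example.com, the selector name, the IP, and the key are all made up):

```
; SPF: only this domain's MX hosts and one listed IP may send its mail
example.com.                 IN TXT "v=spf1 mx ip4:203.0.113.10 -all"

; DKIM: the public key is published under a selector chosen at signing time
mail._domainkey.example.com. IN TXT "v=DKIM1; k=rsa; p=<public-key-here>"
```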

On an unrelated note, you probably saw a certificate error on the site for a couple of days. My apologies. I thought I had set up a cron job to have certbot auto-renew my certs — apparently I did not. That’ll be fixed within the next three months.
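For the record, the cron entry I meant to create is a one-liner. A sketch (certbot’s renew subcommand only renews certs nearing expiry, so running it daily is safe):

```
# root's crontab: attempt renewal every day at 03:00; --quiet suppresses output
0 3 * * * certbot renew --quiet
```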

Finally! A Post!

There should’ve been a couple of posts this week, but alas, stuff got in the way.

First, after some of my MangoLassi friends helped point me in the right direction, I fixed an SELinux issue that was preventing me from being able to update anything on my WordPress installation. Through that same endeavor I discovered WordPress CLI. From that point, my life was forever changed. Updating the WordPress installation is a snap, and I’ll probably use it the next time I need to do a full install of WordPress.
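To give a flavor of it, here are the sorts of WP-CLI commands I mean (run from the WordPress root as the web user; the database name, credentials, and URL below are placeholders):

```shell
# Keep an existing site current:
wp core update
wp plugin update --all
wp theme update --all

# A fresh install can be scripted end to end:
wp core download
wp config create --dbname=wordpress --dbuser=wpuser --dbpass=changeme
wp core install --url=example.com --title="My Blog" \
  --admin_user=admin --admin_email=admin@example.com
```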

So now, I have a nice up-to-date installation, which apparently automatically turned on WordPress’s new Gutenberg editor. I was a bit shocked, but no big deal. New GUI for writing a post? Fine. I’ll poke around for a bit and figure it out. While writing my first post, an error appeared saying “Updating failed.” I then noticed that the Preview function loaded forever and never did anything. I might have tried to publish as well. At this point, I was tired and fed up with WordPress, and decided to just lose the post (yes, I know I could’ve just copied / pasted).

After looking at SELinux logs and doing a little Googling, I turned to the dev tools in Chrome and discovered that there was an issue with mixed content. I had dealt with this before when I first deployed WordPress, and apparently the plugin I had at the time wasn’t doing all I needed it to do. I then discovered SSL Insecure Content Fixer and something to add to my proxy server configuration. So far, that has done the trick, so I now have a current, functional install of WordPress again.
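For anyone fighting the same mixed-content problem: the usual thing to add to a TLS-terminating reverse proxy is a header passing the original scheme through to the backend. A sketch of the relevant lines, assuming an nginx proxy (the backend name is a placeholder):

```
location / {
    proxy_pass http://wordpress-backend;
    proxy_set_header Host $host;
    # Tell the backend whether the client connected over HTTPS:
    proxy_set_header X-Forwarded-Proto $scheme;
}
```

SSL Insecure Content Fixer can then be set to detect HTTPS via the HTTP_X_FORWARDED_PROTO header.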

Already a Week into 2019

While sitting in my chair working on a VM, I realized that the last blog post I did was on December 31st. So here’s a little review over the last week or so. There won’t be much technical content here, just a few musings.

We’ll start with the future. My cat, Lexis, will be having some dental work done at Duluth Animal Hospital on Thursday. This is the first time I’ve ever had a cat undergo a tooth cleaning; however, Dr. Diehl has discussed it with me in the past, and with his last visit, I’ve decided to go ahead and have it done. Dr. Diehl took the time to answer my questions and concerns, so I’m now comfortable putting Lexis through this procedure. He’s a good kitteh, and everyone wish him well!

Today I set a new personal record for jogging a mile: 13 minutes 17 seconds. I’ve been walking and jogging since September to try to fulfill a challenge my girlfriend gave me. She believes if I can jog two laps around a local park (2.5 miles), then I’m likely in good enough shape to return to Tae Kwon Do. Right now I’m jogging 1.25 miles and walking 1.25 miles four times a week. I crossed into 1.25 miles of jogging on January 1st, and I’m hoping by the end of February I’ll be ready to increase to 1.5 miles, which will decrease my walk to 0.75 miles.

And since the new year: I’ve been doing a good bit of planning and thinking about what I’d like to do with Eddie Jennings Services, LLC for 2019. A couple of things are in the works now (which involve the VM I was working on). I’ll do any relevant announcements on the business site.

Before I sign off, I do have to give one technical bit. --strip-components might be the best option available for tar.

Let’s say you have foo.tar.gz, and within it a folder foo containing a ton of files. If you run the command tar -xzvf foo.tar.gz, it’s going to expand everything into a folder named foo. This makes perfect sense: the foo.tar archive is a folder plus files, and all the tar command did was decompress it (remove the .gz) and expand the resulting tar.

Now let’s say you want to extract just the files of foo.tar.gz into the /tmp directory, but not that top-level folder. If you use the above command and add -C /tmp, you’ll have a folder named foo within /tmp with all of the files. Here’s where the magic happens. If you run tar -xzvf foo.tar.gz --strip-components=1 -C /tmp, the top-level directory is stripped, and /tmp will now have all of the files that were formerly within the foo directory. If file1 was a file within the folder foo, then with this new command the extracted file will be /tmp/file1 rather than /tmp/foo/file1. Nifty, right?
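The whole thing can be reproduced in a throwaway directory; a minimal sketch (assumes GNU tar):

```shell
# Build a sample archive: a folder foo/ containing one file.
workdir=$(mktemp -d)
mkdir -p "$workdir/foo"
echo "hello" > "$workdir/foo/file1"
tar -czf "$workdir/foo.tar.gz" -C "$workdir" foo

# Plain extraction recreates the top-level foo/ directory:
mkdir "$workdir/plain"
tar -xzf "$workdir/foo.tar.gz" -C "$workdir/plain"
ls "$workdir/plain"            # -> foo

# --strip-components=1 drops that top-level directory:
mkdir "$workdir/stripped"
tar -xzf "$workdir/foo.tar.gz" --strip-components=1 -C "$workdir/stripped"
ls "$workdir/stripped"         # -> file1
```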

As I’m still learning my way through common applications like tar, feel free to comment if there’s a better way to do what I described.

RHCSA: Getting Back Into The Swing Of Things

Returning to Linux Academy almost feels like coming home. Everything I’ve worked on so far is a review of when I looked at this material about a year ago. It’s nice having things come back to me.

The observation of the day: I love the fact that, if needed, you can reset the root password (or any password) without any third-party software or boot media. You have to have console access, but that’s it. Here’s the procedure I was taught.

  1. At the GRUB menu, select the desired kernel and edit the configuration (press e).
  2. At the end of the linux16 line, add rd.break.
  3. Press ctrl+x to get into the initramfs shell.
  4. Remount the /sysroot directory using mount -o remount,rw /sysroot.
  5. Change root to /sysroot using chroot /sysroot.
  6. Use passwd to alter whatever user’s password.
  7. Create /.autorelabel for SELinux.
  8. Exit chroot and initramfs. Wait for the system to reboot.

Note: Dr. Google says that doing the SELinux relabel might not be the best thing to do. The above is what I was shown from the Linux Academy tutorial, and there may be a better way to do this.
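For my notes, here are steps 4 through 8 as they look at the emergency shell (this mirrors what the tutorial showed; run from the initramfs prompt after booting with rd.break):

```
mount -o remount,rw /sysroot   # step 4: make the real root writable
chroot /sysroot                # step 5: switch into the real root filesystem
passwd root                    # step 6: set whatever user's password
touch /.autorelabel            # step 7: have SELinux relabel on next boot
exit                           # leave the chroot
exit                           # leave initramfs; the system continues booting
```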

On Nextcloud: I was able to get the script to work (by taking some ideas from Jared). I might revisit it one day, but the point of the exercise was to play with building a script, which I did.

Scripts for NextCloud

As mentioned in previous posts, I’m playing with NextCloud again. The end goal is to have an installer script that will complete a basic install of NextCloud on Fedora. There are guides and scripts out there, but this is also a learning exercise for me with Bash scripting. What I’m doing is taking some of those existing guides and scripts and creating my own from them.

The project lives here:

RHCSA Certification

I am one of these folks who would like to know everything, and sometimes, I must be brought back to reality and realize that I cannot know everything. The question then becomes: with the limited time I have, what do I want to learn? Since “everything” isn’t an option, I’ve decided to spend time learning something that follows a flame of passion: Linux administration. Much like my desire to take a semester of Latin during undergrad, there really isn’t a reason for this other than that I simply want to do it. I’m also interested in having a Linux admin job one day, but that’s simply the logical progression from learning about Linux administration.

I do not dislike Windows, and some of the stuff I poked around with in my cert book was useful. I look at it from this perspective: each day at my job I’m honing Windows administration skills and / or researching how to do something within the realm of Windows administration. When I’m at home spending my own time and money, I’d rather put that effort toward a personal interest, which right now is Linux administration.

Of course, I could make the argument that I should spend my personal training time working on skills that improve the skills used for my job. My problem with this is that it blurs the line between when work-time ends and Eddie-time begins. Therein lies the beauty of what I’m doing. Some of the general knowledge and skills I’ve gained from my few Linux administration experiences have helped me become a better Windows administrator. Having another way to do the same kind of task (another way meaning I’m doing the same task in a Linux environment rather than Windows) allows me to understand the why and how of the task a bit more deeply. So when I return to my original environment, I can often troubleshoot better, since I have a deeper understanding of what that task is supposed to do. I have no doubt that what I learn studying for the RHCSA exam will have a positive effect on what I can do in the Windows environment at my job.

Another week is done

Well, the blog isn’t completely ignored. I could easily not be writing this post.

This has been a busy week. Thursday night I had a performance with the Peachtree Symphonic Winds. I don’t remember much before Thursday, but work was busy. Tonight had me trying to cajole NextCloud into working. It’s tough when an error gripes that something can’t write to a particular directory even after I set the permissions to 777 just for fun. I’ll take another stab at it this week under the guidance of Jared Busch.

On a more positive note, some weight loss was experienced this week. I flirted with 269 a couple of times. Perhaps soon I can finally break out of the 270s consistently. I also jogged a 14’46” mile according to my Apple Watch. That’s probably a record for me.

I also did a bit of thinking about my learning lab. I think I’m going to switch gears again to my original goal for the non-network side of the lab: Linux. I do not dislike Windows, but I always seem to have more fun poking around with Linux in my lab scenarios, and I figure this lab should serve two purposes: gain skills and have fun. So step one: Finally get NextCloud functioning — damn it! I’ll let you know if I’m ever successful with it. After that will be seeing what interests me at the Linux Academy. I’m grandfathered in at a good rate, and I remember enjoying using it in the past.

Unrelated note: Upgrade of my KVM host to Fedora 29 was successful!
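For anyone curious, Fedora’s supported in-place upgrade path is the dnf system-upgrade plugin; a sketch of the usual sequence (the release number below matches my upgrade target):

```shell
sudo dnf upgrade --refresh                        # get current first
sudo dnf install dnf-plugin-system-upgrade        # provides the subcommand
sudo dnf system-upgrade download --releasever=29  # fetch the new release
sudo dnf system-upgrade reboot                    # reboot and apply offline
```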

Brief Derailment

Life and Final Fantasy VII have gotten in the way for the last week or so. I cannot complain, for I’m enjoying the distraction. I never did beat Final Fantasy VII back in the day. I know the ending, but I’m endeavoring to actually experience the ending. At the present, I’m trying to gather the Huge Materia.

While debating in my head about continuing to use Office 365 and Zoho for E-mail, I came to a decision: it’s time to deploy and play with NextCloud. Thus, that’ll be my new current project. This will also force me to determine a data backup strategy. An added bonus is that this will all be in a Linux environment. I’ll update the progress here :).