Posts by Logan T. Miles, M.S.ISA, CISSP, HCISPP

Using Moonlight to Play PC Games Remotely (Part 2)

Ok, so I can finally play my PC games while lying on my couch in the other room. That’s neat. However, despite the fact that playing anything on a bike at the gym makes me feel like a total POS, I want to see if I can play my PC games from there.

Anyone want to check out this sick goal I just made in Rocket League? Ok, whatever.

In some initial tests, I was getting good connections, but the stream would always cut out and terminate, to the point where no location was playable, not even my work’s fast internet. It wasn’t until I tried ZeroTier, as recommended on Moonlight’s troubleshooting page, that I really found success.

ZeroTier

ZeroTier is a peer-to-peer VPN that lets you connect devices directly to each other, such as your phone and your streaming PC. Following the instructions online and installing ZeroTier on my phone and laptop had all of my devices connected to each other within 10 minutes.

I don’t want to spend much time getting into ZeroTier itself, but the one change I had to make was that once the new network was added to my devices, I had to re-add the streaming PC in Moonlight (and re-pair the device to it) using its new virtual IP.
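
For reference, joining a desktop client to a ZeroTier network from the command line looks roughly like this (the network ID is a placeholder for your own; on a phone it's all handled in the ZeroTier app):

sudo zerotier-cli join <your 16-digit network id>   # join the virtual network
sudo zerotier-cli listnetworks                      # confirm it joined and note the assigned virtual IP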

Just to give you an idea of the speed from my PC to my ISP:

Be cool to your cable guy, he might give you a little more megabit than you pay for.

This is the bandwidth to the nearest Frontier speed test server, but since the stream will take additional hops, it’s better to run iperf tests between the actual endpoints in this instance.

iPerf is a widely used tool for network performance measurement and tuning. We’ll have it running on the streaming PC and then measure from the client, so it measures the path between the two devices directly, rather than only to whichever of Ookla’s various servers speedtest.net happens to pick.

iPerf3 is also available on Google Play so we can measure our speed from a phone too.
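
If you want to reproduce these numbers, the whole test is just two commands; the IP here is a placeholder for whatever address your streaming PC has (for me, its ZeroTier virtual IP):

iperf3 -s                        # on the streaming PC: run as the server
iperf3 -c <streaming pc ip>      # on the client: measure throughput to the PC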

24 Hour Fitness
1 World Trade Center #110, Long Beach, CA
Network: T-Mobile LTE
Platform: Samsung Galaxy S10e
Distance: <1 Mile

Hats off to the couple of users who did a speed test in boats in the bay.
iPerf results show 23Mbps between my phone and my streaming PC.

I felt like this was going to be a “gimme,” because I’m only a few blocks away from my apartment.

I recorded this video before I installed ZeroTier; this was one of the few use cases that worked without it. Also, as a side note, 4G LTE here is hilariously faster than the 24 Hour Fitness WiFi network, so I didn’t even bother with the WiFi.

Santa Monica Business Park Parking Lot
3010 Ocean Park Blvd, Santa Monica, CA
Network: T-Mobile LTE
Platform: Samsung Galaxy S10e
Distance: 28 Miles

Sometimes traffic is so bad on the 405 it might be better to chill in my car for an hour before heading out.

T-Mobile LTE map for Santa Monica. The little ‘v’ symbols stand for “Validated,” meaning the speeds were collected from users who agreed to share diagnostic data.

Right now we’re in the top-tier coverage area for T-Mobile’s 4G LTE, so let’s rip it.

The iPerf test shows around 11.5Mbps, which should allow us to play at a lower bitrate.

This is absolutely playable. I don’t have any external footage to show the latency, and there is some delay, but it’s well within playability. I feel confident I could play The Witcher 3 fully in this setup.

This actually works, wow. I know I shouldn’t be that surprised at this point with technology, but I am.

So how much data is this using? With Wireshark, I was able to measure the network activity at a couple of different settings, and as my Grandfather would say, “Good Golly Miss Molly.”

Playing at 30Mbps at 720P will still take up around 500MB in only 222 seconds. Playing at 6Mbps isn’t much better, hitting the same 500MB in about 900 seconds (15 min). So if you play for an hour at a lower setting around 6Mbps, you will use up roughly 2GB of your data plan. Yowzers.
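
If you want a quick ceiling for your own plan, the napkin math is just bitrate times time. A one-liner like this (6Mbps is just an example setting) gives the worst case; my actual Wireshark captures came in a bit under it, since the stream doesn't always sit at its full configured bitrate:

bitrate_mbps=6
echo "$(( bitrate_mbps * 3600 / 8 )) MB per hour"   # 2700 MB, roughly 2.7GB worst case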

Allow me to present my graph comparing 30Mbps and 50Mbps streams.

It’s not a perfectly 1:1 comparison in the Wireshark analysis, because the video being streamed was slightly different, and different amounts of on-screen movement cause the codec to compress at different rates, but I think it still conveys the general idea: don’t remote play over LTE unless you have an unlimited plan. Unless you’re bougie.

Grandma and Grandpa’s house
Scottdale, Pennsylvania
Network: 802.11ac 5Ghz
Platform: Dell XPS 15 2-in-1 9575
Distance: 2,455 miles

Let’s really see what the US infrastructure can do!

Next time you’re visiting the in-laws at Thanksgiving, give ’em the ol’ “I ate too much bird” excuse, run upstairs, and get your PJs on. We gon’ play.

Now I know what you’re thinking: if it’s THAT important to me to play games remotely, I should have no problem justifying the purchase of a much more expensive gaming laptop, and you would be right. But still, I wanna see if this is possible.

Not bad for rural Pennsylvania.
[ ID] Interval           Transfer     Bandwidth
[  4]   0.00-10.00  sec  13.8 MBytes  11.5 Mbits/sec               sender
[  4]   0.00-10.00  sec  13.6 MBytes  11.4 Mbits/sec               receiver
iperf Done.

This should be fast enough if we keep the video bitrate to around 5Mbps @ 720P.

I recorded the video with Windows’ built-in screen recorder and then uploaded it at 1080P 60fps, and YouTube’s algorithm blurs the already compressed areas a little more. The video is sharp when things are still, but once anything moves it gets very blurry. Technically, however, it is playable. The latency is a little more noticeable here, but not as much as I was expecting beforehand.

Technically, I could sit here and play this, but at this quality I would rather just wait until I’m sitting at home in front of my TV. Within the next few years or so, though, I don’t see that continuing to be a problem.

Conclusion

I’ve seen a lot of people argue that Google Stadia and other game streaming services are going to fail, but I think this little exercise showed me that it’s a legitimate next step. The next step.

Everyone else’s servers are going to be far more powerful than my rinky-dink PC, and they will have those servers located far closer to everyone than 2,500 miles. With the adoption of 5G, I think it’s perfectly reasonable that the line separating PC gaming and mobile gaming will blur to the point that mobility will no longer be the distinct, defining attribute of a subgroup of gaming; those mobile attributes will be available for all games. Mobile games will no longer be a thing, because all games will be inherently mobile.

Using Moonlight to Play PC Games Remotely (Part 1)

Note: Because everyone’s network is different, the best way to be 100% sure of how Moonlight runs on your network is to try it out yourself. This post is not a technical deep dive, but more of a brief overview with a couple of quick examples from within my local area network. The next part will go a little more in depth on the requirements to stream outside of a LAN, along with some attempts to stream over different connections.

When I was in high school, circa the late ’00s, laptops had just hit the point where gaming on them was becoming a tolerable use case, but not without dropping some serious cash. I saved up for over a year to finally buy a 17-inch MacBook Pro that could boot Windows and play some good, beefy games. It was the best laptop I ever owned, but it was $2,900.

$2900? That’s more than my car.

After college I went back to a desktop and hadn’t needed a mobile solution until recently, now that I’m out of the house far more often. However, I don’t want to drop more than a grand to play something at high settings. I already have a Ryzen 2700 and a GTX 1080 at home that can run almost everything. Is it possible to stream from it? Years ago I tried, but the technology was just on the cusp.

Back in 2013, I was one of the few who jumped onto the Ouya bandwagon. It was an Android-based TV box that came with a wireless controller, and you could do so many different things with it. I mainly used it as a Plex client on my TV for years until I upgraded to an Nvidia Shield in 2017, which is practically the same thing but with a faster CPU. Back then, there was an application called Kainy that let a user stream video from a PC to the device and play games from that client on the TV. I made a video testing out the performance; it was “ok” at the very best.

It wasn’t optimized; games would run windowed, or there would be a half-second of lag even over Ethernet. It was nice as a proof of concept, but it wasn’t enough to replace my gaming setup. And back in 2013, very few PC games integrated with a controller as easily as they do now. I still made a video, because at the time there were barely any results or documentation on it.

Barely playable over LAN on a few games.

It’s been some years since then, so how has performance changed? Some of the big players are coming out of the woodwork and planting their flags: Google Stadia, Shadow, Bethesda Orion, Microsoft xCloud. Judging by the number of players in the game, one has to assume the tech and speeds have caught up, but a service like Shadow costs $24.99 a month to remote into a virtualized desktop running in a datacenter.

I already have a computer that can run the games I want to play; I don’t want to involve a datacenter. I recently bit the bullet and upgraded to 500Mbps up and down, with gigabit on everything wired within the LAN, so I should have all the bandwidth necessary. All I needed to do was find the right program, and that was when I stumbled onto Moonlight.

The best thing about Moonlight is that it’s free.

Moonlight

Moonlight uses Nvidia’s GameStream service, which is designed to stream from a PC to an Nvidia Shield, but Moonlight opens those streams up to other clients, including outside of a LAN. It’s available on multiple platforms too, so we can use Android, Amazon, ChromeOS, Linux, and iOS devices as well. Hell, I might even try a Raspberry Pi if I have some time.
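
For the curious, the Pi client (Moonlight Embedded) is driven from the command line. From memory it's roughly the following, where the IP is a placeholder for your streaming PC; treat this as a sketch and check the Moonlight docs for your version.

moonlight pair 192.168.1.50     # one-time pairing (enter the PIN on the streaming PC)
moonlight stream 192.168.1.50   # start streaming from that PC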

But here is the kicker: being able to remote in and start a game on my PC is meaningless if it’s unplayable, meaning less than 30fps or lag so bad it severely affects the gameplay.

So how does it play on various clients in different situations? I didn’t see any results from anyone, and I knew I wouldn’t know for sure unless I tried it myself. So are any of these streaming devices able to replace my standard “sit at my desk and play at my computer” setup?

Host Gaming PC Requirements

Per Nvidia’s website.

  • NVIDIA GeForce GTX/RTX 600+ series GPU (GT-series and AMD GPUs aren’t supported by NVIDIA GameStream)
  • NVIDIA GeForce Experience (GFE) 2.1.1 or higher
  • 720p or higher display (or headless display dongle) connected to the GeForce GPU
  • 5 Mbps or higher upload speed (only required for streaming outside your house)

Once Nvidia GeForce Experience and Moonlight are installed, Moonlight auto-detects GameStream PCs over the LAN. If not, you can easily add one by entering the IP address of the streaming PC.

Double-clicking the streaming PC in Moonlight will prompt the application to pair with it. After following the instructions on the streaming PC (entering a code to pair), the computer is ready to play.

Testing Methodology

Streaming PC
CPU: AMD Ryzen 7 1700 @ 3.00GHz
Memory: 32GB DDR4 2133MHz
GPU: Nvidia GTX 1070
LAN: 10 Gigabit
Storage: 512GB M.2 and 1TB SSD
Controller: SteelSeries Stratus Duo Wireless Gaming Controller
(Works with PC, MAC, Android, and iOS)

I am not going to get quantitative in this part of the blog. I’m going purely qualitative here; I know I have enough bandwidth in the network. If I happen to run into a bottleneck somewhere, it’s either with Moonlight’s software or my PC’s video hardware.

I’m not going to dig into logs and measure network interference and speeds unless I have to. I’m going to just boot up a couple of games, play for around 5 minutes apiece, and record a couple of clips to showcase the feel of it. The fundamental questions I’m trying to answer are:

Is this playable? And I don’t mean barely. Can this be an enjoyable experience without the streaming becoming a noticeable ordeal?

Does this alter the experience enough that I would prefer to stream rather than sit at my desk as I have for years?


Nvidia Shield (LAN over Ethernet)

Last year I bought an Nvidia Shield; I’ve always liked the Android boxes. I have mine hooked up via Ethernet because I stream a lot of 4K content. Using the Nvidia GameStream application on the device, my tested speed is far above the recommended bandwidth of 12Mbps, which is a surprisingly slim requirement.

It says >30Mbps, but the bottleneck between the Shield and the streaming PC is the gigabit interface on the Shield. Everything else is Cat 7 and 10G, so the actual bandwidth should be much higher.
Sekiro (Max Settings 1080P)

Moonlight runs great, but that’s not necessarily a surprise at this point; GameStream has been available for the Shield for a while (~2017). Even so, I find the video quality to be better than I expected, without the blocking and compression I experienced with Kainy. A lot of work has gone into remote streaming software since 2013. It feels so refreshing to be able to play my PC games on my couch on the other side of my apartment.

All audio is passed through the stream, so I can pop headphones into my controller and it just works. Back in 2013, Kainy would still play audio on my computer speakers. No audio artifacts or slowdown noticed here.

Outer Wilds (Max Settings 1080P)

For most games, Moonlight will automatically optimize the game’s settings to an optimal level, so no, I’m not constantly fiddling around in the settings when I move from one platform to another. If you hadn’t told me this was a stream, I would have had no idea.

Dell XPS Laptop (WiFi LAN)

CPU: Intel Core i7-8705G @ 3.10GHZ
Memory: 16GB DDR4 2400Mhz
GPU: AMD Radeon RX Vega M GL
LAN: 802.11ac
Storage: 256GB M2

Ironically, this laptop’s CPU is actually faster than the streamer’s, but the GPU in the streamer smokes the Vega. I can play Quantum Break just fine with a controller, but there is the slightest delay, small enough that I’m not sure if it’s Moonlight or just Quantum Break. It doesn’t necessarily ruin the experience, though. I feel as though I could complete a whole playthrough with this setup. Note: those green lines on the screen are the LED lights in my room.

Quantum Break (Max Settings 1080P)

I primarily got this laptop for the 4K screen, and surprisingly, with a few tweaks on my streaming PC, I can stream Hellblade: Senua’s Sacrifice in 4K, which is larger than the monitor resolution of the streaming PC. So I’m able to play this game at higher settings than if I actually sat at my desk. Albeit, there is a little slowdown, which is primarily due to the streaming PC struggling to run Hellblade in 4K. I have Fraps running on both computers, and the streaming PC is occasionally dropping below 60fps.

Hellblade: Senua’s Sacrifice (4K, 30FPS, High settings)

But what is the resource drain on the laptop while it’s playing? Just to give you an idea, I switched to a windowed session at 1080P and found that Moonlight is very lightweight. The CPU underclocks, and the GPU isn’t even utilized to decode the video, so battery life is extended even further. The screen backlight will probably be the biggest power draw on this laptop while gaming.

Notice the underclock on the CPU (1.73GHz, down from 3.1GHz) to save battery. To the CPU, this might as well be a single YouTube video. Also, side note: look how muted a screenshot is when it ignores the HDR information that’s in the video.

With Moonlight’s settings at 1080P, 60FPS, and a 98Mbps bitrate, Hellblade: Senua’s Sacrifice plays great. This is my first time playing this game, and if I sat down with a session already running, I wouldn’t be able to tell it wasn’t running on the laptop until I realized the laptop isn’t screaming trying to stay cool.

Mad Max (Max Settings @ 1080P)

Mad Max was a game I regularly played on the Shield for a few months before getting into this. It plays perfectly on the laptop as well, but all of these controller games can hide a little bit of that latency. Let’s try out a mouse-and-keyboard game. How about Cities: Skylines?

Cities Skylines (High Settings @ 1080P)

Plays great. The mouse lag feels like playing a game with a poorly optimized menu, and even that feels too harsh. It’s playable, but wiggling the mouse around feels a little heavier than normal. Even with that, this can absolutely replace my desktop gaming experience. Let’s move on to some of the more obscure clients.

Android Galaxy S8+ (LAN over WiFi)

This past E3, everyone was so impressed with The Witcher 3 running gimped on the Nintendo Switch. So let’s see if I can play it on my phone.

The Witcher 3 (High Settings @ 1080P)

Alexander Graham Bell is shitting his pants in heaven.

Seriously though, this is surreal; I’m not going to be able to go to bed at a reasonable time now. High at 1080P is ok, but I still can’t read any of the text; lowering it to 720P makes this game run butter smooth.

To give you a sense of the latency here, here is GTA V playing on max settings at 720P. The bigger screen is the main monitor of the streaming PC; while Moonlight is running, the main desktop displays the game normally. So you could even ignore the phone’s screen, lower the settings, and simply use it as an input device.

GTAV (Max Settings @ 1080P)

Just look at it. GOD DAMN. I just think that’s the coolest thing.

Asus Junk Laptop (LAN over WiFi)

CPU: Intel Celeron Quad Core N3450 @1.1Ghz
Memory: 3.7GB
GPU: N/A
LAN: Wifi
Storage: 32GB SSD (Really)

Last year, I bought the cheapest laptop I could find, and it was far worse than I ever imagined. It had a Celeron processor and only 32GB of storage. It was a completely wasted configuration; anytime a laptop says onboard memory instead of RAM, you know you’re gonna have a bad time. My first boot and update filled up the hard drive with Windows updates, so I had to wipe it and install Ubuntu MATE. Moonlight has a Linux version available, so let’s boot it up and see if I can stream Sekiro to it.

Sekiro (Max Settings 1080P)

Works like a charm. Maybe I didn’t actually need this XPS as much as I thought? Then again, who am I kidding, look at that rinky-dink screen. For many of us, Moonlight might suddenly become the perfect answer to the “What do I do with this old-as-hell laptop?” question. Now I’m curious what the worst computer I can find that can still run Moonlight is.

Apple iPad 3 (2012) (LAN over WiFi)

CPU: Apple A9 @ 1Ghz
Memory: 1GB DDR2
GPU: N/A
LAN: Wifi 802.11a/b/g/n
Storage: 32GB

This is probably the most impressive one yet. This iPad is so old and so unable to be upgraded that I can’t even download the Netflix app; I have to go through Safari. Moonlight is still compatible, though, so let’s try Sekiro.

Sekiro (Max Settings 1080P)

This is the first instance where I see a slowdown.

Right at the beginning of the video, there’s a sludgy part where the stream slows down and then speeds back up to catch up. The audio became robotic during it, but besides that small slowdown, I haven’t hit any other performance speed bumps, other than struggling to hit the fps ceiling in 4K, which is obviously going to be a bottleneck. 4K gaming is insane.

Seriously, if I type too fast on this iPad it freezes up like an emo kid with crippling anxiety. The fact that it’s mostly playable is incredible. Let’s try a mouse game.

If you’re in a situation where you’re toying with other use cases, definitely look into Moonlight. It might be an option you haven’t considered yet. Personally, after trying a couple of different platforms and games, I have no problem saying that streaming my games within my LAN can absolutely replace my setup for some games. In fact, it might become my default way of playing from now on. But streaming outside of my LAN?

Well that’s another blog post for another day. Until then, take care.

Building a Custom Voice Assistant: Part 1

It’s kind of an open secret that I love Amazon Alexa. My friends know it, my girlfriend knows it, hell, even my two roommates, who I scare the shit out of when I drop in via Alexa to randomly say hello in the kitchen, know it. This is all because over the past six months, I’ve been slowly offloading small tasks to Alexa: turning on the lights, setting timers, playing music, etc. I’ve started experimenting with “scenes,” automated actions such as turning the lights and the air conditioner on when I come home or off when I leave for work. It’s surprising how much these small automations have altered how I operate day to day. Looking forward, I hope to use them even more, but there’s a problem. See, to put it lightly, I’m not really a fan of Jeff Bezos’ piggy bank listening to everything I say. The problem doesn’t lie only with Amazon, though. Every single cloud-based voice assistant offered by Google, Facebook, or anyone else cannot be trusted to protect your privacy.

So in an effort to minimize my data footprint, I’m gonna build a private voice assistant myself. I’ve seen numerous guides on building your own Amazon Alexa on custom hardware such as a Raspberry Pi, but those still send queries to Amazon. I want to build a private voice assistant that can replace Amazon Alexa’s functions for me while ensuring that my privacy is protected at the same time.

To make a long story about decentralization and privacy by design short(er), all of these companies’ business models operate by mining your data and selling it. Amazon isn’t offering these Echo Dots and Echo Spots at such insanely cheap discounts for no reason; Amazon intends to milk them for all they’re worth. There’s even a new Echo Auto for only $25 that connects to your phone so you can use it in the car. Knowing that the Amazon Alexa phone app requires location permissions, I’ll bet my Grandma’s lucky silver dollar that it continuously sends that data to Amazon. It’ll send anything it can store, because some have estimated that Amazon will hit $10 billion in Alexa-related sales in 2020 alone. Amazon knows exactly what it’s doing: Alexa is going to be the main point of contact between Amazon and its customers over the next decade, so this pull into the internet of things is only going to ramp up.

Every single time I state an action, my words are translated to text, and that text is then parsed, categorized, and stored. I know because I can open my Alexa phone app and see in the history that the other day, my roommates, 30 miles away, asked the Alexa in the kitchen if she likes handjobs. She didn’t respond, she’s always so coy, but anyway, I digress.

A lot of people are guilty of obfuscation when it comes to the cloud, making it seem far more complicated than it is. The cloud is just someone else’s computer, really. It’s also heavily insinuated that these complex voice assistants, often branded as “AI,” require processing in the cloud, the argument being that it takes too much processing power to handle these commands on the device.

That is not true. It is perfectly possible to process these sorts of commands locally on the device.

This is how we’re told Amazon Alexa operates: requests are processed through Amazon’s API on their servers, routed through the device manufacturer’s network, and then pushed to the device. For example, the RGB LED lights in my living room are from a company called Magic Home that requires its own account and sign-up process. This opens a can of worms and raises the question: how much info is being shared between Amazon and Magic Home? Is Amazon allowing Magic Home access to a lot more data than it should? Is even my Alexa-connected coffee pot sending private information somewhere?

The way to resolve this is by processing everything locally on the device. Allow my very shitty diagram to illustrate. I have been looking at a number of different solutions, and I think I’m going to try out Snips.ai first. Snips seems to be trying to do exactly what I had in mind: local processing of queries, all done in an open-source environment, so I can guarantee that my information isn’t sent anywhere I don’t want. I could even unplug my router and it would still operate normally, unlike Alexa, who has a stroke when you do that.

I know from the Magic Home app that the modules can be controlled over the local network via an app; the devices themselves broadcast their own tiny WiFi network that phones can connect to. If for some reason I can’t go that route and I have to include Magic Home’s servers in the process, I will at least have reviewed the messages myself and will implement whatever compensating controls I can, but I’ll cross that bridge if necessary. If worse comes to worst and I can’t use the small Magic Home LED module with an LED strip (found here), I can wire the LEDs directly with a MOSFET.

So how much is this gonna cost anyway?

Actually, not that much.

 

Step 1: Hardware

1.) Raspberry Pi 3 Kit with Clear Case and 2.5A Power Supply – $49.99

I found a decent kit that includes most of everything on Amazon here, but damn, we’re already going over our budget. Just tell yourself you’re saving money in the long run by not paying with your data. Amazon is probably selling Echo Dots at huge losses anyway, and there’s no way to compete with that financially. I went with the Pi 3 because I don’t want to have to worry about any performance bottlenecks, so I’m not even putting thought into older Raspberry Pis. If this config works, we can try older Pis later and perhaps find cheaper kits to use.

2.) 3.5mm Mini Portable Stereo Speaker for iPod

I bought this 3.5mm speaker, but I didn’t realize it required a battery. Don’t buy that one; buy this one instead, which has a USB cable that can keep the speaker powered. Also, don’t buy things on Amazon in a flurry.

3.) TONOR PC Microphone USB Computer Condenser Studio Mic

I’m looking for something omnidirectional. I imagine tweaking the microphone setup to find the sweet spot of sensitivity is going to be a chore; I’ve even seen other guides where people use a microphone array.

This’ll do.

4.) USB Memory Stick

I already have one of these lying around, but you should be able to find a microSD online for less than $10. If you don’t have a microSD, you can only boot from a USB stick after you’ve already booted from a microSD once. Just get a microSD, you cheap bastard.

Step 2: Software

Take that microSD and install NOOBS on it here… right after you realize you don’t have a card reader and quickly buy one. Also, now that we’re realizing we’re missing some basic stuff, make sure you have a keyboard and mouse too.

I’m running Debian (found here) in a virtual machine in Oracle VM VirtualBox (found here) on my PC (found in my apartment). If you’re running Linux or Mac, you don’t need the virtual machine and can run the commands straight from the terminal. After you have the Pi booted and running, make sure you enable SSH on it (how-to here) and be sure to harden it (guide here) so it doesn’t spontaneously learn Mandarin.
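
In case those links rot: on a stock Raspbian install, enabling SSH usually comes down to one of the following (a quick sketch, assuming the default Raspbian image; your setup may differ).

sudo systemctl enable ssh    # start the SSH service at every boot
sudo systemctl start ssh     # start it right now
# Or, for a headless setup: create an empty file named "ssh" on the boot partition
# before first boot and Raspbian will enable SSH automatically.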

Installing Curl, Node, NPM, and Snips

Open up the terminal on the virtual machine (or the terminal on Mac or Linux) and run the following:

curl -sL https://deb.nodesource.com/setup_8.x | sudo bash -

After that, run the following to make sure node.js (at least v7.5.0) and npm are installed.

sudo apt-get install nodejs

Verify installs on both by running node -v and npm -v

Now run the following to install sam, the Snips command-line tool.

npm install -g snips-sam

 

Connecting to the Pi

Now, from the virtual Debian machine on my PC, I should be able to connect to the Pi by running the following command.

sam devices

Ordinarily, sam will list all the devices it detects, and you should be able to connect directly by running sam connect raspberrypi.local. In this scenario, however, it wouldn’t detect my Pi. Running ifconfig on the Pi will display its IP address; take that IP and run sam connect <ip address of raspberry pi>

sam connect <ip address of pi>
Enter username for the device: pi
Enter password for the device: 
Connected to <ip address of pi>

Log in with your Pi username and password and voilà, you’re connected and logged in on the Pi. Anything run on this command line will be executed on the Pi. It’s all downhill from here once you run the following.

sam init

Watch the command line go to work. It’ll take a few minutes, so go make a Shirley Temple in the meantime. Or, if you have friends who are easily impressed, you can let them watch and then watch them assume you’re practically Lisbeth Salander. Whoa buddy, big-league hacker shit here.

 

Configure Snips Hardware

After the install is complete, run the following to get a status on everything.

sam status

You should get something like this

sam status

Connected to device raspberrypi.local

OS version ................... Raspbian GNU/Linux 9 (stretch)
Installed assistant .......... Not installed
Status ....................... Installed, not running

Service status:

snips-analytics .............. 0.55.2 (not running)
snips-asr .................... 0.55.2 (not running)
snips-audio-server ........... 0.55.2 (running)
snips-dialogue ............... 0.55.2 (not running)
snips-hotword ................ 0.55.2 (not running)
snips-nlu .................... 0.55.2 (not running)
snips-skill-server ........... 0.55.2 (not running)
snips-tts .................... 0.55.2 (running)

Let’s quickly run through the main pieces and ensure everything is working. Run sam test speaker

With your speaker connected, you should hear a voice. I have an HDMI cord running from my Pi, and I heard the audio through the TV, so the output from the device is working. Let’s move on to the microphone and plug it in.

After plugging that bad boy in, run sam setup audio

This will allow you to select the microphone.

sam setup audio
Starting microphone setup...
What microphone do you use?
[1] Generic USB
...

After it’s selected, run sam test microphone

sam test microphone
Testing microphone
Say something in the microphone, then press Enter...
...

Try recording a quick joke (and press enter) to hear it back and realize just how unfunny you are.

Then run sam sound-feedback on

This adds the “ding” when you make a command.

 

Install Demo

We’re almost at the end. Run sam install demo

This should install and start the Snips service and load it with a basic test app. The default test app just translates your speech into text via the STT (speech-to-text) API and then repeats it back with the TTS (text-to-speech) API. Once it’s done installing, Snips can be operated by voice: say “Hey Snips, <phrase to be repeated>”. You can probably tell it’s not perfect, but it’s a usable base to improve upon. Any custom commands and tweaks, including my own lighting setup and automations, I’ll document in part 2.

 

For now, you can say, “Hey Snips, the colossus of clout!” and you can marvel that you’ve made a digital Tommy “Repeat” Timmons from The Sandlot.

 

 

Years (2005-2018)

Since I moved to Long Beach, I’ve tried to familiarize myself with the city and the communities that reside within it. That includes the many artists who call Long Beach home, and my girlfriend recently informed me of the passing of Laura Aguilar, a Latina LGBT photographer who had an impact within Chicana feminism. Much of her work showcases people from marginalized communities, and some of it highlights the intersection of the various identities she held within those communities. My girlfriend is Mexican, and her identity is also complex and multifaceted. I’ve made a conscious effort to familiarize myself with the various communities, histories, and cultures my girlfriend finds herself in, in order to be a better partner for her in the long run.

Much of Aguilar’s work features herself as the subject. That made me think about myself and what kinds of photos I have saved. I’ve always tried to avoid being in front of the lens for the majority of my life, but I still have some photos of myself through the years, and most of them haven’t seen the light of day. I have a 6TB RAID that stores every photo I’ve ever taken: personal cell phone, SLR, senior photos, prom pics. I saved everything. It’s about 100,000 images and about 800GB in size. This past weekend I felt like going through my library, picking a single photo of myself from each year, and openly talking about it: how the photo makes me view myself, think about my life, notice the changes in my face, etc.

 

So allow me to present the following,  a collection of photos of myself over the years.


The Parable of the Hollow Tree

When I was a kid, I used to love climbing trees; it was one of my favorite things to do. On one particular afternoon when I was about 11, I was down the street from my house, hiking in a wooded area with a bunch of trees and tall grass. I remember there was this one specific tree with a low branch that looked easy enough to jump up, grab, and climb. However, when I grabbed the branch mid-jump, I was greeted with a loud crack like thunder as the branch snapped and completely ripped a large side off of the tree. Both the branch and I hit the ground with significant force as the air was violently knocked out of my lungs. I was lucky the branch fell in front of me and not on top of me, because it was extremely heavy and I was alone out there. I could have found myself in my very own rendition of 127 Hours.

As I slowly got up off the ground with a groan, I was shocked to see that the tree was almost completely hollow. It wasn’t dead…yet. It still had some leaves on all of its branches, but it was barely strong enough to sustain its own weight. All it took was the 135lbs or so I weighed at the time to completely deface the tree. It seemed pretty obvious that this tree was on its last legs. It was probably termites or something, but at the time I had no idea how it happened or how such a thing was even possible. I remember the confusion I had as I brushed the dirt off of my cargo shorts. How on earth did this tree get completely hollowed out?

So a little backstory is required here. A couple of years ago, I was in a Facebook forum with a decent number of other Protestant Christian ministers of varying denominations and belief systems. From fundamentalists to universalists, the diversity was noticeable. There were times of rabid disagreement, blow-ups, and the occasional troublemaker who would be removed from the group, but for the most part, everyone was united by a core belief in Christ and a desire to make his love known to the world. There were many beautiful moments where people found common ground with each other despite gigantic disagreements elsewhere theologically: Calvinists, Arminians, Preterists, Post-Tribs, and occasionally an emergent guy asking the “Rob Bell questions” that would stir the pot. The forum was a journey for everyone, and eventually it settled down and activity slowed. Everyone reached what appeared to be their destination of belief, and there are only so many times you can discuss whether baptism is necessary for salvation before you start to want to pull your hair out. Eventually, the discussions there ceased completely.

On Sunday, I stumbled upon a thread where many of these very same Christians were discussing James Fields, the man accused of driving his car into a crowd of protesters, injuring 19 and killing 1. It was a discussion regarding the guilt of the driver and explanations for how and/or why he wasn’t at fault. A link was eventually posted to an Allen West page pushing a 4chan /pol/ theory alleging that the driver was driving slowly until a protester hit the car with a bat, which caused the driver to fear for his life. Of course, the video attached is edited to mute the sound of the screeching tires and accelerating engine heard in the raw video. I read on another post somewhere (I didn’t manage to get a screenshot), “If you don’t want to get hit, get out of the road,” despite the fact that the road was closed the next block up prior to the protest. The sheer moral disconnect on display sent chills down my spine. I was reminded that a few months ago, I’d seen memes of cars driving through protests on highways met with laughter in the comments like “They better not be in front of me ha.” Fox News and The Daily Caller used to have articles advocating for cars to violently drive through protesters. As you would expect after this weekend, both of those posts were removed after the death of Heather Heyer.

This gave me pause, and it made me question myself. I’m not one to stand in the courtyard and cry for crucifixion. I believe in our justice system and long for justice for all; I believe James Fields deserves a fair trial by a jury of his peers. Now, it is one thing to try to remain neutral, but to try to justify the act is another, far more disgusting thing entirely. Eventually, the conversations I found myself in on Facebook began to widen in scope to include the Charlottesville protest as a whole. Who’s responsible for this whole thing? Franklin Graham, who has over 5 million followers and significant influence over American Evangelicalism, decided to weigh in.

Franklin offers a bold suggestion: blame should instead be assigned to the city council, city politicians, the mayor, or even the governor. Really? I can think of no easier time to know where to assign blame than when swastika-flag-waving Nazis are marching in the street, starting the protest.

Franklin touched on how long the statue has been there, clearly a subtle endorsement that the Confederate memorial should stay, which led into yet another conversation with people about its justification. The response I received was quite harsh. “Erasing history,” “ISIS does the same,” “Orwellian,” “un-American,” and “similar to destroying Mt. Rushmore” are just some of the lines thrown at me. Any conversation about the Confederacy will naturally lead back to the Civil War. I’m a little ashamed to admit it, but I spent a few hours on Tuesday butting heads with people who were not just defending the memorial, but the Confederacy itself. Just a few of the arguments I heard:

“The vast majority of confederates were not slave owners.”

“These monuments aren’t connected to racism or white supremacy.”

“African Americans fought for the confederacy too.”

“The Civil War wasn’t over slavery, but state rights.”

And don’t forget the most repugnant, reprehensible one.

“Black people were better off as slaves in America rather than back in Africa.”

I’m not going to debunk these myths here; that’ll be another post for another time. However, it was at this moment that I felt like I was on my back again, staring up at that hollow tree. How on earth did this happen? How did we get to a point where almost unlimited excuses are given for the perpetrator, but no one even considers the victims? It seems I have found myself in a culture that I clearly don’t fit in. It seems that for everyone around me, the least of these are only considered when it fits their politics, every…single…time. Does the blood of Heather Heyer not cry out like Abel’s did? Does the blood of the millions killed under the Nazi flags waved in history not cry out? Does the blood of those lynched in the name of white supremacy not cry out like Abel’s did?

Instead, more devotion is given to defending an inanimate statue of concrete and iron, more time is spent sanitizing the Confederacy, which was quite clearly founded upon slavery, and more work is put into defending a President clearly comfortable with the support of those reprehensible people.

I’m reminded of the times over the last couple of years when others’ blood cried out: Philando Castile, Eric Garner, 12-year-old Tamir Rice, and so many others. The average response from these people was always the same…bumbling justification or just crickets and shrugs. It still blows my mind that many of these Christians are willing to paint the entire Black Lives Matter movement with a broad brush…but somehow white supremacists and people literally waving swastika flags are given the benefit of nuance. Perhaps this is what Moses felt like when he came down from Mount Sinai and found everyone worshiping a golden calf. As of this writing, 7 CEOs have resigned from Trump’s Manufacturing Council due to his comments on Charlottesville, but not a single pastor has resigned from his Evangelical Advisory Council.

Like the tree I tried to climb as a kid, much of American Christianity appeared full of life at first, but at the core, it’s dead and rotten. Now, I know that American Christianity is not a monolith. There are much smaller organizations and denominations that have separated themselves from this larger group for these very reasons. But for mainstream evangelicals, the termites of politics have burrowed in, and now there is no going back. In my opinion, such a reform is impossible. If a pastor tries to correct course, those corrupted people will just leave and go to the church down the street that agrees with them. Greg Boyd lost about a thousand people from his church when he preached a sermon series with the intent of freeing the church from the claws of partisan politics. And if that church down the street tries to correct course too, well, then those same rotten people will start their own church. I know this because some of the racist arguments I heard over the past few days came from people who did just that.

It’s only a matter of time until the leaves fall, the rest of the tree collapses, and the wood of mainstream American Evangelicalism fully disintegrates into the soil. One can only hope that in the future, something better will eventually grow in its place.