I'm not sure where I got this or how it ended up on my "to read" list, but I'm really quite blown away by it. It amazes me to know that some hardcore Haskell guy somewhere is busy shoving ideas like monads into commonplace languages like Visual Basic. I had also never stopped to consider that Visual Basic is interesting because it embraces the idea of "static typing where possible, dynamic typing when needed."
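That hybrid idea is easy to see in any language with a dynamic escape hatch. Here is a minimal sketch in TypeScript (my choice of language for illustration, not anything from the paper): statically typed code, dynamically typed data from `JSON.parse`, and a runtime check that moves a value back into the static world.

```typescript
// Statically typed function: the compiler checks every call site.
function area(width: number, height: number): number {
  return width * height;
}

// Dynamically typed data, e.g. parsed JSON: the compiler can't help here.
const raw: unknown = JSON.parse('{"width": 3, "height": 4}');

// A runtime type guard carries the value back into the static world.
function isDimensions(v: unknown): v is { width: number; height: number } {
  return (
    typeof v === "object" && v !== null &&
    typeof (v as { width?: unknown }).width === "number" &&
    typeof (v as { height?: unknown }).height === "number"
  );
}

if (isDimensions(raw)) {
  // Inside this branch, raw is statically known to have width and height.
  console.log(area(raw.width, raw.height)); // prints 12
}
```

The point is that the static and dynamic halves aren't separate languages: one guard function is the whole bridge between them.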
I decided to give Ubuntu 20.04 a try on my 2015 15" MacBook Pro. I didn't actually install it; I just live booted from a USB thumb drive, which was enough to try out everything I wanted. In summary, it's not perfect, and issues with my camera would prevent me from switching, but given the right hardware, I think it's a really viable option.

The first thing I wanted to try was what would happen if I plugged a non-HiDPI screen into my laptop, which has a HiDPI screen. Without fractional scaling, whatever scale factor I picked for one screen would apply to the other. However, once I turned on fractional scaling, I was able to pick different scale factors for the internal and external displays. That looked OK.

I tried plugging in and unplugging multiple times, and it didn't crash. I doubt it'd work with my Thunderbolt display at work, but it worked fine for my HDMI displays at home. I even plugged it into my TV, and it stuck to the 100% scaling I picked for the other external display.
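For anyone who wants to reproduce this: in Ubuntu 20.04's GNOME, per-monitor fractional scale factors are gated behind an experimental mutter feature, which (as far as I know) can be toggled with gsettings. The exact flag differs between Wayland and X11 sessions:

```shell
# Wayland session: enable per-monitor fractional scaling in mutter.
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"

# X11 session uses a different experimental flag instead:
# gsettings set org.gnome.mutter experimental-features "['x11-randr-fractional-scaling']"
```

After setting this and logging back in, the Displays panel offers non-integer scale options per monitor.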
Comments
We are all Haskell programmers now!