Investing and intrinsic value

One of the things that comes with being part of the technology industry is the awareness that there’s a lot of money in technology. Almost everyone knows someone working at a startup or making money off technology companies. Personally, I’ve invested small amounts (up to NZD $1,000) into NZ tech companies here and there. These have resulted in some gains and some losses; overall, not much either way (probably about “even” if I had kept track of things).

One thing that always did bother me was that I had no idea how to calculate the “fair price” (also called “intrinsic value”) of a company that isn’t paying any dividends. (I “roughly” knew how to work out the value of a company that was paying dividends).

A few months ago, I heard about Hatch Investments and that they let you buy US stocks directly from NZ (ASB only lets you buy NZX- and ASX-listed stocks, and Sharesies didn’t have US stocks, though apparently this is coming).

After signing up to Hatch, I first just bought some index funds, figuring it was a low-risk way to see how it all works. The process is roughly:

  • Transfer NZD to Hatch bank account
  • Wait a (working) day or two for the money to get converted to USD and made available in your account (not sure whether they take a cut here)
  • Money is there to invest!

Then, at some point, I started wondering about individual companies, but still had my doubts as to how to actually work out their “value”. With this question on my mind, I read through one of the Hatch articles, where they profiled an NZ investor by the name of Tom Botica. This led me to check out his YouTube video explaining his method of working out intrinsic value. Going through it, the method made a lot of sense to me, in that I could understand it and couldn’t see any obvious problems with it (it does come with the caveat that it only applies to stocks of well-established, large-capitalisation companies).
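To make the idea concrete, here is a sketch of that kind of calculation in Python. This is my own illustration with made-up numbers, not necessarily his exact method: project earnings per share forward, apply a conservative terminal P/E multiple, then discount the resulting future price back to today at your required rate of return.

```python
def intrinsic_value_per_share(eps, growth_rate, years, terminal_pe, discount_rate):
    """Rough "fair price" for one share of a large, established company."""
    # Compound today's earnings per share over the projection period.
    future_eps = eps * (1 + growth_rate) ** years
    # Assume the market will pay a given multiple for those future earnings.
    future_price = future_eps * terminal_pe
    # Discount that future price back to a present value.
    return future_price / (1 + discount_rate) ** years


# Hypothetical inputs: $5 EPS growing 10%/year for 10 years,
# a terminal P/E of 15, discounted at 9%.
print(round(intrinsic_value_per_share(5, 0.10, 10, 15, 0.09), 2))
```

If the current share price is well below the number that comes out, the stock looks underpriced by this measure; every input here is a judgment call, which is why the caveat about large, stable companies matters.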

So, armed with a spreadsheet, I was kicked into action by the crash around March 20th, triggered by COVID-19. I set about working out the intrinsic value of Amazon and Google stock. Lo and behold, they were underpriced, and I bought some.

Will this bet pay off? I don’t know. In the short time since then (about six weeks?) a lot of people have started pointing out that the market recovery is not logical and that it resembles a “dead cat bounce”. This is probably true. What will the world look like in a year or two? Who knows.

Implementing Conway’s Game of Life

There’s a phenomenon where, when a famous writer, musician or artist dies, their works experience a rise in popularity upon the news of their passing. I don’t know if I’m proud of it (or whether I should be proud of it), but I do this too.

Recently the mathematician John Conway passed away, succumbing to the COVID-19 virus, taken before his time: https://www.wired.com/story/the-legacy-of-math-luminary-john-conway-lost-to-covid-19/

At the same time, I had taken out the Python Playground book from the public library, with the ambitious intention to read it cover to cover. This didn’t happen and I had to return the book to the library, hopefully for some other soul with more time and motivation to read. In an act of stubbornness, I did then go and buy the ebook, with a secret promise to myself that I would read it some day.

One of the chapters that really caught my attention in the book was the one about implementing the “Game of Life”, a famous example of a computer simulation that I had only previously briefly heard about and never tried to implement. The simulation was created by John Conway.

On hearing of his death, I put aside any thoughts of studiously reading every page and trying every example and instead focused on just completing the chapter implementing the game of life. By this time in my life, I had heard so much about this algorithm that it had reached mythical status in my mind. I had expected and mentally prepared myself for several days of trying to implement and understand the code and the algorithm.

Unsurprisingly to anyone who has implemented the “Game of Life” themselves, I read through and finished the code example in about an hour.

Conway’s Game of Life

Then it hit me why this was so special and interesting. This whole “Game of Life”, which had become ginormous in my mind, was actually just a relatively small set of rules applied, over and over, to an “unlimited” space. The beauty and the lesson weren’t in the complexity of the system, but in its simplicity and in how that simplicity could lead to complex behaviour.
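For anyone who hasn’t seen it, the entire rule set fits in a few lines. The book’s version adds numpy and matplotlib for animation; this stripped-down sketch of my own (assuming a wrap-around grid of 0s and 1s) just applies the rules for one generation:

```python
def step(grid):
    """One generation of Conway's Game of Life on a toroidal (wrap-around) grid.

    The full rule set:
      - a live cell with 2 or 3 live neighbours survives
      - a dead cell with exactly 3 live neighbours becomes alive
      - every other cell dies or stays dead
    """
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the 8 neighbours, wrapping around the edges.
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            if grid[r][c] == 1:
                new[r][c] = 1 if neighbours in (2, 3) else 0
            else:
                new[r][c] = 1 if neighbours == 3 else 0
    return new
```

Run it repeatedly on a starting pattern (a horizontal row of three live cells, for instance, flips between horizontal and vertical forever) and all the famous gliders and oscillators fall out of just this.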

I feel bad that I had let it build up so much in my mind. I feel bad that it took John Conway dying for me to actually just try implementing the game. I am really glad I did though. Life is full of things we know we “should” do or look into or things that we want to know or look into. Unfortunately, we are not invincible and we don’t know when our time will be up. Seemingly small things that we’ve always wanted to do are important to find time for before it’s too late.

What’s your “Game of Life”?

Differences between MySQL and PostgreSQL with INSERT

So, recently when investigating an issue with a MySQL to PostgreSQL migration, I found a surprising difference in behaviour between the two databases.

The difference has to do with how each behaves when explicitly passed NULL values in an INSERT statement for columns with defaults or auto-increments (called “serials” in Postgres).

What do I actually mean? To compare, in MySQL you can do this:

mysql> create table baz(whattimeisit TIMESTAMP DEFAULT CURRENT_TIMESTAMP);
Query OK, 0 rows affected (0.02 sec)
mysql> show create table baz;
+-------+---------------------------------------------------------------------------------------------------------------------------+
| Table | Create Table                                                                                                              |
+-------+---------------------------------------------------------------------------------------------------------------------------+
| baz   | CREATE TABLE `baz` (
  `whattimeisit` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP
) ENGINE=InnoDB DEFAULT CHARSET=latin1 |
+-------+---------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)

mysql> INSERT INTO baz (whattimeisit) VALUES (NULL);
Query OK, 1 row affected (0.00 sec)

mysql> select * from baz;
+---------------------+
| whattimeisit        |
+---------------------+
| 2019-08-25 20:35:47 |
+---------------------+
1 row in set (0.00 sec)

Here MySQL silently reverts to the “DEFAULT” value when passed an explicit NULL. (As far as I can tell, this is legacy TIMESTAMP behaviour, controlled by the explicit_defaults_for_timestamp setting; AUTO_INCREMENT columns treat an explicit NULL similarly, but for most other NOT NULL columns MySQL would raise an error too.)

In Postgres, we get the following error:

postgres=# CREATE TABLE baz (whatsthetime TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP NOT NULL);
CREATE TABLE
postgres=# \d baz
                                 Table "public.baz"
    Column    |           Type           | Collation | Nullable |      Default      
--------------+--------------------------+-----------+----------+-------------------
 whatsthetime | timestamp with time zone |           | not null | CURRENT_TIMESTAMP

postgres=# INSERT INTO baz(whatsthetime) VALUES (NULL);
ERROR:  null value in column "whatsthetime" violates not-null constraint
DETAIL:  Failing row contains (null).

Funnily enough, in PostgreSQL, if you don’t pass the column at all, the default is used and you don’t get this error.
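If you do want the default from an explicit INSERT in PostgreSQL, the standard way is the DEFAULT keyword (or DEFAULT VALUES when every column should take its default). Continuing the session above:

```sql
postgres=# INSERT INTO baz(whatsthetime) VALUES (DEFAULT);
INSERT 0 1
postgres=# INSERT INTO baz DEFAULT VALUES;
INSERT 0 1
```

So the migration gotcha is really about code that passes explicit NULLs expecting MySQL’s substitution behaviour; such statements need rewriting to use DEFAULT or to omit the column.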

Linux.conf.au summary

Awesome conference: the march of open source continues, and technological progress has led us to the point where we can automatically identify predators in NZ’s bush, as well as sequence DNA live during a conference talk.

The Internet of Things theme wasn’t as prominent as I thought it would be. The embedded space is still very much a mess in my opinion. To get started you need to make sure you have a ton of adapters, cables and patience. It just feels like yak shaving all the way down at the moment. However, this opinion might be due to my biases coming from the software side of things. The stuff I’m finding hard might just be teething pains or the learning curve.

From an open source philosophy perspective, there was a bit of ignoring the elephant in the room, namely the rise of public clouds and the Apache vs GPL licensing model. However, when it was addressed, it was quite inspirational. One of the main philosophical points was that the four freedoms are inherently tied to what users want, and therefore, on a long enough timeline, going against them will lead to giving users something inferior.

One of the most salient practical points was that using the Apache licence creates a lot of confusion and work for the legal team: each commit potentially has to go through the “shall we open source this or not?” process.

Turn on TLS on your Postfix email server

So, as someone who runs their own mail server, one of the things I kept noticing was that when sending to GMail, the message arrived, but with a red padlock with a cross through it and the message “No encryption”.

From reading about it, I learned that GMail had turned this warning on for all sending servers that didn’t use TLS.

This was a surprise to me as I had thought that TLS was already enabled on the server. It turned out that the encryption was enabled on the “incoming” connection from the mail client, but not for any “upstream” message sending (i.e. when my server is initiating the SMTP protocol).

After a bit of research, I came upon these two links in the Postfix documentation:

http://www.postfix.org/postconf.5.html#smtp_tls_security_level

http://www.postfix.org/TLS_README.html

The key configuration line is:

smtp_tls_security_level = may

This single line turns on “opportunistic TLS”, where “opportunistic” means that our server will encrypt the connection as long as the recipient server supports TLS. As per the documentation:

The SMTP transaction is encrypted if the STARTTLS ESMTP feature is supported by the server. Otherwise, messages are sent in the clear.

You can also define a table where you specify encryption settings on a “per recipient domain/server” level (e.g. to say to always use encryption for GMail, but for all others to use opportunistic encryption).
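As a sketch (the file path and policy here are illustrative, but smtp_tls_policy_maps and the “encrypt” policy level are real Postfix features), that per-destination table is wired up like this:

```
# /etc/postfix/main.cf
smtp_tls_security_level = may
smtp_tls_policy_maps = hash:/etc/postfix/tls_policy

# /etc/postfix/tls_policy
# (run "postmap /etc/postfix/tls_policy" after editing, then "postfix reload")
# Always require encryption for GMail; all other domains fall back to "may".
gmail.com       encrypt
```

Domains not listed in the table get the default smtp_tls_security_level, so this gives hard guarantees where you want them without breaking delivery to servers that still don’t support TLS.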

So, after setting the above configuration and reloading the configuration, you can run the test again and verify that the padlock and encryption warning are no longer present.

Dart language

If you’re using Flutter for your mobile applications, you will have to use Dart as your programming language. This was weird for me, as I struggled a little with the syntax (seemingly some kind of mix of JavaScript and Java). As such, I went and did the “Dart for Java developers” tutorial here:

https://codelabs.developers.google.com/codelabs/from-java-to-dart/#0

My initial impressions of Dart as a language:

  • It is a lot like Java
  • It adds a whole bunch of “shortcuts” (e.g. fat arrow “=>”) to become more terse to write
  • It uses a mixture of dynamic and static typing
  • It makes all objects interfaces by default, side-stepping the inheritance vs composition argument to a large extent
  • Seems to be a superset of Java in a lot of ways
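The fat-arrow shorthand, for instance, looks like this (a hypothetical snippet in the spirit of the tutorial, not taken from it):

```dart
class Bicycle {
  int speed = 0;

  // Fat-arrow shorthand for a one-expression method body.
  void speedUp(int delta) => speed += delta;

  @override
  String toString() => 'Bicycle at $speed mph';
}

void main() {
  final bike = Bicycle(); // no `new` keyword needed
  bike.speedUp(10);
  print(bike);
}
```

Very Java-shaped overall, but the `=>` bodies and string interpolation cut out a lot of the ceremony.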


TIL Geolocation doesn’t work on mobile browsers

So, I’m working on a project to help people who are already “out” find a place to eat, and part of it is a feature to “find the nearest three places”. My initial thought was “that’s cool, I’ll just make a website that uses the JavaScript Geolocation API” and make it available as a “web app” on people’s phones. This was wrong. It seems the only reliable way to get location data is to go “native”.

My initial approach was to figure out the “unknown unknowns”, starting off with “can we reliably get the user’s geolocation through the browser?”. The “proof of concept” code was as follows:
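A minimal version of that proof of concept looks roughly like this (a sketch; the helper name is illustrative):

```javascript
// Sketch of the proof of concept: ask the browser for the current
// position and show it on the page (or log why it failed).

// Pure helper, so the formatting can be exercised outside a browser.
function formatPosition(pos) {
  return `lat: ${pos.coords.latitude}, lon: ${pos.coords.longitude}`;
}

// Browser-only part, guarded so the file also loads outside a browser.
if (typeof navigator !== "undefined" && navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    (pos) => { document.body.textContent = formatPosition(pos); },
    (err) => { console.error("Geolocation failed:", err.message); }
  );
}
```

Dropped into a plain HTML page, this is enough to trigger the permission prompt and surface the success or error path described below.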

First, I tried opening the file in Firefox and was greeted by the “allow this site to view location data” prompt. I clicked “Allow”… no joy. The site did not display anything, and examining the JavaScript/Web console showed my “Allow” click registering as a “Don’t allow”:

[Screenshot, 2018-09-01: web console showing the geolocation request being reported as denied]

After checking the network traffic tab, as well as looking at and messing around with “about:config”, I came to the conclusion that Firefox is just disabling the location API (maybe because I’m running the Developer Edition or something). Running this test in Chrome showed the same initial behaviour (a prompt to allow the website access to location), however the data ended up being displayed (albeit a bit slowly), so Geolocation was working in Chrome at least.

I thought that maybe the issue was with Firefox not allowing location sharing unless the page was shared over HTTPS, so I moved onto the next stage, which was uploading the file to a web server enabled S3 bucket. Even when accessing the file over HTTPS, the same issue happened. I would click “allow” and the error message would come back saying “user has denied geolocation prompt”.

Next up was the all important mobile test (Safari on iOS). I tried loading the previously mentioned S3 website in Safari on iOS and…. nothing… no co-ordinates. After a bit of thinking, I remembered a previous project where a company with a very usable web app had released a “native” app, with a pretty flimsy reason (so we know when you park up in our shop car park). The speculation there was that they were using the location data all the time and having a native app is the only real way to do this. So… there it is, back to the drawing board.

If it is to be a mobile application, I’m inclined to try to build it using Flutter. But I don’t know… it’ll be my first go at mobile application development, so whatever it is, it’s going to be a learning curve.