Posts for year 2018

Pizza, another family favourite

Another family favourite recipe is Pizza - made from scratch and baked in our oven on pizza stones. The recipe for the dough is really simple; I got it from "Better Homes & Gardens" magazine years ago, written up by Karen Martini.

Ingredients

  • 1 Dry Yeast Sachet (7g)

  • 125ml warm water

  • 250ml cold water

  • 400g White Plain Flour (+ more for dusting and kneading)

  • 100g Semolina

  • pinch of salt

  • pinch of sugar

  • 1 tbsp olive oil

Utensils

  • 3L saucepan, with lid

  • rolling pin

  • 2L mixing bowl

  • pizza stones

  • large spoon (for stirring)

  • kitchen scales

  • cork mats

  • large spatula

  • Ulu

  • Silicone pastry mat

Process

  • Fill your kettle, and boil it

  • Place the salt and sugar into the mixing bowl, along with the 250ml of cold water and then the 125ml of warm water (from the kettle). If the water feels hot when you dip a fingertip in, allow it to cool for a few minutes. Stir the water, salt and sugar together.

  • Add the contents of the yeast sachet, swirling it around in the bowl.

  • Wait for about 5 minutes for the yeast to activate. After you place the yeast in the bowl, it will drop to the bottom. After a few minutes (usually 5) you will notice a bubbling action and observe the now-activated yeast coming to the surface in something akin to a foam.

  • Gently but firmly stir the flour and semolina into the bowl until you've combined most of the floury bits into a ball, then tip it out onto your bench, preferably onto a

    /galleries/food/recipes/Pizza-Goodness/baking-sheet.jpg

    baking sheet which you've lightly dusted with flour.

  • Dust your hands with flour, then work all the remaining bits of flour into the ball, and knead it for about 2 minutes. The dough ball should be glossy.

  • Wash out your mixing bowl, ensuring that you get all the remaining specks of dough out, then dry the bowl. You might want to re-boil your kettle at this point.

  • With your 3L pot underneath, pour a little bit of the kettle's hot water onto the outside of the bowl, then fill the pot with about 1-1.5L of the remaining water.

  • Dribble the olive oil around the inside of the bowl so that you have even coverage.

  • Place the dough ball in the bowl, and roll it around in the oil so the whole thing is covered in a fine film of oil.

  • Cover the bowl with clingfilm, then place it on top of the pot and put the lid on.

    /galleries/food/recipes/Pizza-Goodness/dough-in-pot-rising.jpg
  • Allow the dough to rise. This should take about 45 minutes. During this time you should clean your baking sheet and benchtop, and prep any toppings that you want on your pizzas.

    /galleries/food/recipes/Pizza-Goodness/dough-after-rising.jpg
  • Heat the oven to 250C, with the pizza stones on the racks.

  • Once the dough has doubled in size, it is ready to knead again and roll out to the size of your pizza stones. Tip it out onto your dusted baking sheet (or benchtop), discard the clingfilm, and then knead the dough ball with a little more flour.

  • At this point we divide the dough into portions by weight. Over the years I've determined that the ideal weight per dough ball for our pizza stones is 350g. I generally aim for two balls at 350g, and whatever is left becomes a smaller pizza on which I experiment with toppings that the children aren't interested in trying yet.

  • After dusting the rolling pin with flour, roll the dough ball out on the baking sheet, so that it is about the same size as the pizza stone.

    /galleries/food/recipes/Pizza-Goodness/dough-on-baking-sheet.jpg
  • Take your pizza stones out of the oven, and leave all but one in a safe place. Place that one on the bench on top of the cork mats.

  • Transfer the base to the stone.

  • Apply toppings, and then place into the oven and wait until baked. Repeat for the rest of the dough.

  • The pizza is properly baked when it has a crispy underside. After the pizza has been in the oven for about 10 minutes, check by sliding a spatula underneath the base while it is still on the stone. If the pizza is ready, remove it from the oven and let it rest.

  • Once the baked pizza has rested sufficiently (it won't be staggeringly hot to touch), you need to cut it. I use the Ulu for this.

    /galleries/food/recipes/Pizza-Goodness/Ulu-on-stand.jpg

    /galleries/food/recipes/Pizza-Goodness/Ulu.jpg
  • Enjoy!




Replacing your self-signed certificate in svc:/system/identity:cert

One feature of your freshly installed Solaris 11.4 instance that can fly under the radar is the svc:/system/identity:cert service. This provides you with a system-generated (that's your system, not Oracle) certificate which is self-signed, and which a number of other services depend upon:

$ svcs -D identity:cert
STATE          STIME    FMRI
disabled       Apr_26   svc:/system/rad:remote
online         Apr_26   svc:/system/ca-certificates:default
online         Apr_26   svc:/milestone/self-assembly-complete:default
online         May_03   svc:/system/webui/server:default

By-the-bye, the svc:/system/ca-certificates service helps keep the system copy of Certificate Authority certificates updated.

So what do you do if you want to get past an error like this when you try to access https://127.0.0.1:6787 so you can try out the WebUI?

/images/2018/05/self-signed-cert-error.png

Once you've obtained a CA-signed certificate, it's actually very easy to do:

# SVC=svc:/system/identity:cert
# svccfg -s $SVC setprop certificate/cert/pem_value = astring: "$(cat /path/to/signed/certificate.crt )"
# svccfg -s $SVC setprop certificate/cert/private_key/pem_value = astring: "$(cat /path/to/signed/certificate.key )"
# svccfg -s $SVC setprop certificate/ca/pem_value = astring: "$(cat /path/to/issuer/certificate.crt )"
# svcadm refresh $SVC
# svcadm restart -sr $SVC
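A word of caution before you run those commands: if one of the PEM files is truncated or mis-pasted, identity:cert may not come back cleanly. Here's a minimal, hypothetical Python sketch of the sanity check I'd do first; it only looks for complete BEGIN/END markers, and the paths in the usage comment are placeholders.

```python
# Hypothetical pre-flight check: make sure each PEM file contains a
# complete BEGIN/END block before feeding it to svccfg. This checks
# the markers only, not the certificate contents.
def looks_like_pem(path, kind="CERTIFICATE"):
    """Return True if `path` holds a complete PEM block of `kind`."""
    with open(path) as f:
        text = f.read()
    return ("-----BEGIN {}-----".format(kind) in text and
            "-----END {}-----".format(kind) in text)

# e.g. looks_like_pem("/path/to/signed/certificate.crt")
#      looks_like_pem("/path/to/signed/certificate.key", "PRIVATE KEY")
```

Depending on how your key was generated, the block type might be "PRIVATE KEY" or "RSA PRIVATE KEY", hence the `kind` parameter.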



Where did the old posts go?

When I set up my blog in 2006, I chose to use roller, which was the same engine that the now-defunct blogs.sun.com used at the time. Later, having gotten tired of the interface and being more impressed with Wordpress' facility for image galleries, I started running my own instance of that.

After a while, however, the frequent CVEs in both PHP and the Wordpress base and plugin systems got me sufficiently motivated to change, so I did.

My friends Shawn and Liane suggested going for a static site generator, so after having a brief look at Pelican and Nikola, I instead chose Hugo.

Hugo's chief attraction was the relative ease with which I could create image galleries, along with the ability to import Wordpress sites. Yay, thought I, I can have a fairly seamless transition, and away we went.

I didn't actually check the site backup that I made before turning off the Wordpress instance, however, and when I went looking for the pkgrepo procedure that I used for building darktable and didn't find the content that I wanted, I was a bit annoyed.

Given that Solaris' packaged version of Go is somewhat behind the community version, and that Hugo depends on a much newer version, this was also the trigger for me to re-explore Pelican and Nikola, both of which are written in Python. After a brief flirtation with Pelican I settled instead on Nikola and did the initial Hugo to Nikola migration fairly easily. Chris pointed me to the gallery directive plugin and I was able to make a start with some of my more recent gallery collections. A quick implementation of a PR for captioned and ordered images got me the rest of the way and then I could get back to the real problem: the missing content.

Fortunately for me, the Wayback Machine had a copy of the old site entries, and with a quick installation of the wayback machine downloader I was able to grab the whole site as it was up to 2016.

Phew!

Except that all the post files were chock full of Wordpress and roller's javascript and expanded css, which were a real mess to look at, let alone extract the posts from.

So I did what anybody else would do, and wrote some Python using BeautifulSoup to provide a best-effort extraction, translating the html+js+css into the plain-text (and therefore portable) ReStructuredText format. Since this is a best-effort attempt, I'm not too concerned about the output being perfect rst that matches my original posts; I went through about 20 different entries, tidying up the input so that the script would produce something close to what I wanted. I knew that I'd have to post-process quite a few entries as well; I just didn't want to do too much work to get that going.
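To give a flavour of the approach (the real wp-to-rest.py uses BeautifulSoup and handles far more cases), here is a stdlib-only sketch of the same idea: pull headings and paragraphs out of post HTML and emit them as rst.

```python
# Sketch only: extract headings and paragraph text from simple post HTML
# and emit ReStructuredText. The real script handles many more tags.
from html.parser import HTMLParser

class PostExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []
        self.heading = False

    def handle_starttag(self, tag, attrs):
        # Only treat h1/h2 as headings in this toy version
        self.heading = tag in ("h1", "h2")

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self.heading:
            # rst heading: title underlined with '=' of the same length
            self.out.append(text + "\n" + "=" * len(text))
            self.heading = False
        else:
            self.out.append(text)

def html_to_rst(html):
    parser = PostExtractor()
    parser.feed(html)
    return "\n\n".join(parser.out)

print(html_to_rst("<h1>Pizza</h1><p>Another family favourite.</p>"))
```

Everything beyond this (code blocks, image galleries, links) is where the other two days of finessing went.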

I've converted about 340 posts extracted from the wayback machine archive using this script, wp-to-rest.py, which took about 2 days to write and finesse, and another day or so of mucking around with several posts to get them into better shape (i.e., running nikola build doesn't yell at me). There are still a bunch of broken links in there, but at least now I've got all my content back, and can very easily fix things up as I get the inclination.




Solaris Analytics Collections, partitions, slices and operators

Having poked around with Solaris Analytics, and the WebUI a little, you might have wondered what information or statistics we gather by default. For instance, what are the statistics which we collect to make the Solaris Dashboard sheet useful?

/images/2018/05/solaris-dashboard.png

The feature we provide to make this happen is the Collection. Collections give us a handy shorthand for gathering statistics. We ship with several collections:

# sstore list //:class.collection//:collection.name/*
IDENTIFIER
//:class.collection//:collection.name/root/apache-stats
//:class.collection//:collection.name/root/compliance-stat
//:class.collection//:collection.name/root/cpu-stats
//:class.collection//:collection.name/root/network-stats
//:class.collection//:collection.name/root/solaris-dashboard
//:class.collection//:collection.name/root/system

Listing collections is a privileged operation; if I run the command above as myself then I get a very different result:

$ sstore list //:class.collection//:collection.name/*
Warning (//:class.collection//:collection.name/*) - lookup error: no matching collections found

The collection which is enabled by default is //:class.collection//:collection.name/root/system, and you can see what it gathers by running sstore info on it:

# sstore info //:class.collection//:collection.name/root/system
Identifier: //:class.collection//:collection.name/root/system
  ssid: //:class.system//:*
 state: enabled
  uuid: 7a002985-2cf4-4965-adc9-b53116d8ae67
 owner: root
 cname: system
crtime: 1523243338963817

I quite like having the solaris-dashboard and apache-stats collections enabled, and that is really easy to do:

# sstoreadm enable-collection \
    //:class.collection//:collection.name/root/solaris-dashboard \
    //:class.collection//:collection.name/root/apache-stats

One thing I'm always concerned with, since our family media server is, shall we say, homebrew, is whether my disks are doing ok. Fortunately for me, it is very easy to cons up my own collection and stash it in /usr/lib/sstore/metadata/collections:

[
    {
        "$schema": "//:collection",
        "description": "disk-related statistics",
        "enabled": true,
        "id": "disk-stats",
        "ssids": [
            "//:class.disk//:res*//:*"
        ],
        "user": "root"
    }
]

and once you've restarted sstored you can see it like so:

# sstore info -a //:class.collection//:collection.name/root/disk-stats
Identifier: //:class.collection//:collection.name/root/disk-stats
   ssid: //:class.disk//:res*//:*
  state: enabled
   uuid: bee6c5c5-487e-4376-9d91-f4eb933fd64e
  owner: root
  cname: disk-stats
 crtime: 1525373259871426

[Note that you do need to ensure that your collection validates against the collections schema, so run soljsonvalidate /path/to/my/collection.json, and if you need to reformat it, soljsonfmt /path/to/my/collection.json].
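soljsonvalidate is the authoritative check, but if you're generating collection files programmatically it can be handy to catch the obvious mistakes earlier. Here's a rough Python stand-in; note that the required-key list is inferred from the example above, not taken from the shipped schema.

```python
import json

# Rough stand-in for soljsonvalidate: check that each entry in a
# collection file has the fields the shipped collections use. The key
# list is inferred from the disk-stats example, not from the real
# schema in /usr/lib/sstore/metadata/json-schema.
REQUIRED = {"$schema", "description", "enabled", "id", "ssids", "user"}

def check_collection(text):
    """Return a list of (id, missing-keys) problems; empty means OK."""
    problems = []
    for entry in json.loads(text):
        missing = REQUIRED - set(entry)
        if missing:
            problems.append((entry.get("id", "?"), sorted(missing)))
    return problems

sample = """[{"$schema": "//:collection",
  "description": "disk-related statistics", "enabled": true,
  "id": "disk-stats", "ssids": ["//:class.disk//:res*//:*"],
  "user": "root"}]"""
print(check_collection(sample))  # prints []
```

You still want soljsonvalidate (and soljsonfmt) for the real thing; this just fails faster.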

So that's useful - now what? How about looking at the illegal requests counter? When you run iostat -En that information is jumbled up with all the other errors and can be a little difficult to distinguish:

$ iostat -En sd0
c2t0d0           Soft Errors: 0 Hard Errors: 0 Transport Errors: 0
Vendor: ATA      Product: WDC WD30EFRX-68E Revision: 0A82 Serial No: WD-WCC4N7CNYH0S
Size: 3000.59GB <3000592982016 bytes>
Media Error: 0 Device Not Ready: 0 No Device: 0 Recoverable: 0
Illegal Request: 537 Predictive Failure Analysis: 0 Non-Aligned Writes: 0

With Solaris Analytics, however, we can gather all of those errors together in aggregate and partition them at the same time. This command shows us the most recent data point (the -p -1 argument):

$ sstore export -p -1 "//:class.disk//:res.name/sd0//:stat.errors//:part.type"
TIME                VALUE IDENTIFIER
2018-05-08T19:07:44  //:class.disk//:res.name/sd0//:stat.errors//:part.type
                    device-not-ready: 0.0
                    hard-errors: 0.0
                    illegal-requests: 537.0
                    media-errors: 0.0
                    no-device: 0.0
                    non-aligned-writes: 0.0
                    predictive-failure-analysis: 0.0
                    recoverable: 0.0
                    soft-errors: 0.0
                    transport-errors: 0.0

That's a bit more useful! (Yes, having to use sdN rather than cXtYdZ is a pain, sorry). So... how about just looking for the illegal-requests? That's where we really make use of the partition concept - and let's add the arguments to give a daily total from the start of this month (May 2018):

$ sstore export -t 2018-05-01T00:00:00 -i 86400  "//:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)"
TIME                VALUE IDENTIFIER
2018-05-01T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 27.0
2018-05-02T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 27.0
2018-05-03T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 27.0
2018-05-04T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 79.0
2018-05-05T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 174.0
2018-05-06T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 270.0
2018-05-07T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 365.0
2018-05-08T00:00:00  //:class.disk//:res.name/sd0//:stat.errors//:part.type(illegal-requests)
                    illegal-requests: 461.0

Much more useful - and observe that because we're using () to extract the partition element, we need to quote the argument so the shell doesn't get snippy with us.

To finish this post, let's take a look at two more really useful features: slices and operators. One operator that I'm particularly happy with is //:op.changed, which shows you when a statistic value changed. While not particularly useful for volatile statistics on a per-second basis (watch //:class.system//:stat.virtual-memory for a few minutes and you'll see what I mean), if you aggregate such stats over a longer time period, such as a day, you can get a better understanding of what that stat is doing. So, with disk errors again, but on a daily basis (-i 86400) from the start of this month (-t 2018-05-01T00:00:00):

$ sstore export -t 2018-05-01T00:00:00 -i 86400  "//:class.disk//:res.name/sd0//:stat.errors//:op.changed"
TIME                VALUE IDENTIFIER
2018-05-01T00:00:00 27.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-04T00:00:00 79.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-05T00:00:00 174.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-06T00:00:00 270.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-07T00:00:00 365.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-08T00:00:00 461.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
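If it helps to think about what //:op.changed is doing, here's a tiny Python model of it - my reading of the operator, not sstored's actual implementation: a point survives only if its value differs from the previous point's. The sample data mirrors the daily totals above.

```python
def changed(points):
    """Keep only (time, value) points whose value differs from the
    previous point's value -- a toy model of //:op.changed."""
    out, prev = [], object()   # sentinel: first point always survives
    for t, v in points:
        if v != prev:
            out.append((t, v))
            prev = v
    return out

# The first few days of the illegal-requests series from above:
daily = [("05-01", 27.0), ("05-02", 27.0), ("05-03", 27.0),
         ("05-04", 79.0), ("05-05", 174.0)]
print(changed(daily))
```

Which is why 05-02 and 05-03 vanish from the //:op.changed output: the counter sat at 27 for three days.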

Finally, slices. These are written //:s.[....], and inside the brackets you enter the names of the elements you wish to extract. Once again I'm using the //:op.changed operator to constrain the output:

$ sstore export -i 86400  //:class.disk//:res.name/sd//:s.[0,28]//:stat.errors//:op.changed //:class.disk//:res.name/sd//:s.[0,28]//:stat.//:s.[vendor,serial-number]//:op.changed
TIME                VALUE IDENTIFIER
1970-01-01T10:00:00 27.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-04T10:00:00 119.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-05T10:00:00 214.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-06T10:00:00 309.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-07T10:00:00 405.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
2018-05-08T10:00:00 500.0 //:class.disk//:res.name/sd0//:stat.errors//:op.changed
1970-01-01T10:00:00 169.0 //:class.disk//:res.name/sd28//:stat.errors//:op.changed
2018-05-04T10:00:00 120.0 //:class.disk//:res.name/sd28//:stat.errors//:op.changed
2018-05-05T10:00:00 215.0 //:class.disk//:res.name/sd28//:stat.errors//:op.changed
2018-05-06T10:00:00 310.0 //:class.disk//:res.name/sd28//:stat.errors//:op.changed
2018-05-07T10:00:00 406.0 //:class.disk//:res.name/sd28//:stat.errors//:op.changed
2018-05-08T10:00:00 501.0 //:class.disk//:res.name/sd28//:stat.errors//:op.changed
1970-01-01T10:00:00 ATA      //:class.disk//:res.name/sd0//:stat.vendor//:op.changed
1970-01-01T10:00:00 Z1D5K89L //:class.disk//:res.name/sd0//:stat.serial-number//:op.changed
2018-05-04T10:00:00 WD-WCC4N7CNYH0S //:class.disk//:res.name/sd0//:stat.serial-number//:op.changed
1970-01-01T10:00:00 ATA      //:class.disk//:res.name/sd28//:stat.vendor//:op.changed
1970-01-01T10:00:00  //:class.disk//:res.name/sd28//:stat.serial-number//:op.changed
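The slice notation is compact enough that a little model helps. This hypothetical Python sketch expands a single //:s.[...] element the way I read it; sstored's real parser also copes with the second slice in the command above, while this toy handles just one.

```python
import re

def expand_slice(ssid):
    """Expand one //:s.[a,b,...] element into concrete SSIDs -- a toy
    model of the slice notation, not sstored's own parser."""
    m = re.search(r"//:s\.\[([^\]]+)\]", ssid)
    if not m:
        return [ssid]
    head, tail = ssid[:m.start()], ssid[m.end():]
    return [head + item.strip() + tail for item in m.group(1).split(",")]

print(expand_slice("//:class.disk//:res.name/sd//:s.[0,28]//:stat.errors"))
```

So //:s.[0,28] after "sd" fans out to sd0 and sd28, which is exactly the pair of resources you see in the export output.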

For more information about operators, slices and partitions, have a read of ssid-op (aka ssid-op(7)).

Tune in next time when I'll guide you through the process of using a proper certificate for your WebUI instance, rather than the default self-signed certificate.




Chicken Roulade

I've been wanting to write up my recipe list for a while now, and starting to use a new blog engine is as good a reason as any to finally start doing so.

Ingredients

  • 2 large chicken breasts (about 350-450g each)

  • 100g baby spinach leaves

  • 80-100g sun-dried tomato strips

  • 50ml canola or grapeseed oil (for frying)

Utensils

  • 2x 80cm cooking twine

  • clingfilm

  • meat tenderiser mallet

  • spatula

  • tongs

  • large chopping board

  • frying pan, medium-large diameter

  • baking tray

Process

  • Turn your oven to 180degC. The rest of the steps take about 15 minutes.

  • Tie a 5-6cm loop in one end of each piece of twine. You're doing this now because at the point when you really need the loop, your hands will be sticky with chicken and sun-dried tomato.

  • Fan out the baby spinach leaves onto a plate.

  • Place the sun-dried tomato strips on a plate.

  • Place the chicken breasts on your board, and cover them and the board with clingfilm. It doesn't need to be tightly wrapped, but you should ensure you have a wide margin around each side.

  • With your meat tenderising mallet, whack the chicken breasts until they are about 1cm thick.

  • Remove the clingfilm and discard.

  • On each chicken breast, place a layer of the baby spinach leaves. You want to cover it, but not too thickly.

    /images/2018/04/view-from-end.jpg
  • On top of the layer of baby spinach leaves, spread the sun-dried tomato strips on one half.

    /images/2018/04/ready-to-roll.jpg
  • This is the tricky bit - and why you're glad you put the loop in the twine earlier!

  • Carefully roll one chicken breast over, to make a sort of a sausage. Take the loop end of one piece of twine, place it at one end of the sausage, thread the other end through and then wrap the length around the rest. Fold the end underneath another part of the string so that it is reasonably tight. Repeat for the other chicken breast.

    /images/2018/04/one-ready.jpg
  • Heat your frying pan to searing temperature and then add in the oil.

  • When the oil is hot enough, place the roulades in the pan and let them sear - but do not let them cook.

    /images/2018/04/searing-start.jpg
  • With your spatula, unstick the roulades, then with the tongs give a 1/3 or 1/4 turn and keep on searing. Repeat until each roulade is seared all over.

    /images/2018/04/searing-one-turn.jpg
  • Transfer to the baking tray and then place into the oven, on a middle rack. Cook at 180degC for 23 minutes.

    /images/2018/04/ready-to-roast.jpg
  • At the 23 minute mark, remove from the oven and check that they have cooked through using a meat thermometer. If they have, then rest for 5 minutes before carving. If they have not, then return to the oven for another 3-4 minutes.

  • Slice into about 1cm thick portions, and serve.

    /images/2018/04/carved.jpg



Another (blog) engine change

Last year I wrote that I was changing my blog engine. I've made another engine change over the last few days, to Nikola. The primary driver of this change is that Solaris' shipping version of Go is quite a way behind the version required by Hugo, which limits my ability to work on it should I desire to.

Another reason is that I've been working on a work-related post or two lately, which I'm posting on this blog since they cover work I've done that builds on $DAYJOB.

In the end, the choice came down to Pelican or Nikola and the relative effort involved in translating Hugo's gallery format into something else. The effort in translating my Markdown-formatted files into reStructuredText is pretty minimal; it will just take time.




Monitoring my PV Inverter

I'd like to share with you a way to build on the Solaris Analytics components, sstored and the WebUI, using the setup I have for monitoring our PV inverter.

In April 2013 we had 15 solar panels installed on our roof, providing 3.9kW. This came with a JFY SunTwins 5000TL inverter, which has the useful feature of data monitoring via an RS232 serial port. The installers provided a CD with a 32-bit Windows app which, while useful to start with, did not allow me to push the generation data up to a service like pvoutput.org.

Not being interested in leaving a laptop running Windows on during daylight hours, I searched for open source monitoring software, finding Jonathan Croucher's solarmonj via whirlpool. Importantly, I also found a programmer reference manual for the inverter, although it is poorly translated. I also discovered that there are a number of gaps between what it describes and what packet analysis and the windows app actually do.

I'm not too keen on C++, but it built fine on the raspberry pi that I had available to use, and ran well enough. With a bit of shell scripting around it I was able to upload to pvoutput.org and see how things were going.

Solarmonj has some attributes which I dislike: it's not (wasn't, at that time) a daemon, logfile generation isn't enterprise-y, it doesn't take command line arguments (so the device path is compiled in), doesn't handle multiple inverters from the same daemon, and doesn't let you send any arbitrary command listed in the spec document. It's single purpose - but performs that purpose well enough.

(Checking out Jonathan's github repo, I see that the version I was using did in fact get an update about 3 years ago, but I had the utility in "set and forget" mode, so never noticed).

I've written a new monitor which builds on Croucher's work, with these features: it's written in Python, daemonises, is configurable, handles multiple inverters, and updates pvoutput.org and sstored as well as writing to a local file.

With this project I'm also providing sstored configuration files and a WebUI sheet:

JFY Inverter sheet in the Solaris Analytics WebUI

Zerothly, you can get all the code for this project from my Github repo. It's still a work in progress but it's at the point of being sufficient for my needs so I'm happy to share it.

Since this post is about how to plug your app into Solaris Analytics, I won't delve too much into the SMF and IPS components of the code.

At the end of my previous post on the work blog I mentioned that I would discuss using the C and Python bindings for the Stats Store. This post covers just the Python bindings, leaving detailed coverage of the C interface for another day.


First of all, let's have a look at two basic architecture diagrams:


The C and Python bindings enable read-write access to sstored, so that you can write your own provider. We call this a "userspace provider" because it operates outside of the core of sstored.

For both bindings, we have three methods of putting data into sstored:

  • per-point synchronous (using a door_call())

  • per-point asynchronous (using a shared memory region)

  • bulk synchronous (using a door_call())

The code that I've written for this utility is using the asynchronous method which (at the bottom of the stack) depends on an mmap region which is shared between the daemon and the client process. Since we do not have a real speed constraint for updating sstored, I could have used the synchronous method. I'll discuss the bulk synchronous method later.

To start with, my daemon needs a connection to sstored. Assuming that the service is online, this is very simple:

from libsstore import SStore, SSException

sst = SStore()

(I've written the daemon so that each attached inverter has its own connection to sstored, so sst is a thread instance variable).

The user that you run this code as must have these authorizations:

  • solaris.sstore.update.res

  • solaris.sstore.write

Add these to the user by uttering

# usermod -A +solaris.sstore.update.res,solaris.sstore.write $USER

Once you've got those authorizations sorted, you can add the appropriate resource to the class. I've chosen to name the resources with each inverter's serial number. My device's serial number is 1522130110183:

RESOURCE_SSID_PREFIX = "//:class.app/solar/jfy//:res.inverter/"
STATS = [
    "temperature",
    "power-generated",
    "voltage-dc",
    "current",
    "energy-generated",
    "voltage-ac"
    ]

# hr is Human-Readable, after we've processed the binary response
# from the inverter
hr_serial = "1522130110183"
resname = RESOURCE_SSID_PREFIX + hr_serial

try:
    sst.resource_add(resname)
    print_warnings()
except SSException as exc:
    print("Unable to add resource {0} to sstored: {1}".format(
        resname, exc))
    usesstore = False
    raise

stats = []
for sname in STATS:
    stats.append("{0}{1}//:stat.{2}".format(
        RESOURCE_SSID_PREFIX, hr_serial, sname))
try:
    stats_array = sst.data_attach(stats)
    print_warnings()
except SSException as exc:
    print("Unable to attach stats to sstored\n{0} / {1}".format(
        exc.message, exc.errno), file=sys.stderr)
    usesstore = False
    sst.free()
    sst = None

Each time we query the inverter, we get back binary data which needs decoding and extracting. This is the "feature" of the documentation which annoys me most: it doesn't match the data packet returned, so I had to go and click through the inverter's front panel while watching retrieved values so I could determine the field names and units. Ugh. Anyway, inside the thread's run() method:

stats = query_normal_info()
if not stats:
    return
if usesstore:
    sstore_update(stats)

Now comes the magic:

def sstore_update(vals):
    """
    Updates the stats in sstored after stripping out the ignore[12]
    fields in JFYData. We're using the shared memory region method
    provided by data_attach(), so this is a very simple function.
    """

    values = {}
    for idx, fname in enumerate(JFYData):
        values[stats[idx]] = vals[fname] / JFYDivisors[idx]
    sst.data_update(values)

Then we go back to sleep for 30 seconds, and repeat.
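For the curious, the shape of the thread's main loop is roughly this. It's a hypothetical rendering: `query` and `update` are injected here for clarity, where the real daemon calls query_normal_info() and sstore_update() directly.

```python
import time

def poll_loop(query, update, interval=30, iterations=None):
    """Poll the inverter, push the results out, sleep, repeat.
    `iterations` exists only so the loop can be exercised in tests;
    the daemon effectively runs with iterations=None, i.e. forever."""
    done = 0
    while iterations is None or done < iterations:
        stats = query()
        if stats:              # skip the update if the query came back empty
            update(stats)
        done += 1
        if iterations is None or done < iterations:
            time.sleep(interval)
```

Each inverter thread runs its own copy of this loop against its own sstored connection.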

An essential part of this project is the set of JSON metadata files that we provide to the Stats Store. Without these, the daemon does not know where the class, resources and statistics fit inside the namespace, nor what units or description to provide when we run sstore info for any of these statistics.

All resources underneath a class must have the same statistics, and we need to decide on the resource namespace prior to adding the class to sstored. Here is the file class.app.solar.jfy.json, which in the service/jfy package I deliver to /usr/lib/sstore/metadata/json/site:

{
    "$schema": "//:class",
    "copyright": "Copyright (c) 2018, James C. McPherson. All rights reserved.",
    "description": "JFY Solar Inverter monitor",
    "id": "app/solar/jfy",
    "namespaces": [
        {
            "name-type": "string",
            "resource-name": "inverter"
        }
    ],
    "stability": "stable",
    "stat-names": [
        "//:stat.temperature",
        "//:stat.power-generated",
        "//:stat.voltage-dc",
        "//:stat.current",
        "//:stat.voltage-ac",
        "//:stat.energy-generated"
    ]
}

This is validated by the daemon using the schemas shipped in /usr/lib/sstore/metadata/json-schema, and comes with a companion file stat.app.solar.jfy.json. On your Solaris 11.4 system you can check these using the soljsonvalidate utility. (Note that it's not currently possible to get sstored to dynamically re-read metadata definitions, so a svcadm restart sstore is required.)

The last component that we have is the WebUI sheet, which I have packaged so that it is delivered to /usr/lib/webui/analytics/sheets/site.

This is accessible to you once you have logged in to your BUI instance and selected the Solaris Analytics app from the menu.

Prior to accessing that sheet, however, you need to configure the service. On my system, the attached RS232 port is /dev/term/0, and I have a PVOutput.org API Key and System ID:

# svccfg -s jfy
svc:/application/jfy> listprop config
config           application
config/debug     boolean     false
config/logpath   astring     /var/jfy/log/
config/usesstore boolean     true
svc:/application/jfy> listprop devterm0
devterm0                 inverter
devterm0/devname         astring     /dev/term/0
devterm0/pvoutput_apikey astring     elided
devterm0/pvoutput_sysid  count       elided

To create your system's configuration, simply add in the appropriate definitions below. I suggest naming your inverter property group in a way that you find useful; the constraint is that it must be of type inverter:

svc:/application/jfy> addpg devterm0 inverter
svc:/application/jfy> setprop devterm0/devname = astring: "/dev/term/0"
svc:/application/jfy> setprop devterm0/pvoutput_apikey = astring: "your api key goes here"
svc:/application/jfy> setprop devterm0/pvoutput_sysid = count: yourSysIDgoesHere
svc:/application/jfy> refresh
svc:/application/jfy> quit
# svcadm enable jfy

Since I'm running the service with debugging enabled, I can see copious details in the output from

$ tail -f `svcs -L jfy`
[ 2018 Apr  4 06:46:22 Executing start method ("/lib/svc/method/svc-jfy start"). ]
args: ['/usr/lib/jfy/jfymonitor.py', '-F', '/var/jfy/cfg', '-l', '/var/jfy/log', '-d']

response b'\xa5\xa5\x00\x000\xbf\x101522130110183   \xfa\xcb\n\r'
response b'\xa5\xa5\x02\x010\xbe\x01\x06\xfd\xbe\n\r'
Registration succeeded for device with serial number 1522130110183 on /dev/term/0
Inverter map:
id   1: application
id   2: 1522130110183
{u'devterm0': {u'devname': u'/dev/term/0', u'pvoutput_sysid': u'elided', u'pvoutput_apikey': u'elided'}}
[ 2018 Apr  4 06:46:44 Method "start" exited with status 0. ]

And there you have it - a brief example of how to use the Python bindings for the Solaris Analytics feature.

If you have questions or comments about this post, please send me a message on Freenode, where I'm jmcp. Alternatively, add a comment to the github repo.




Seriously, firefox, what the heck?

Every now and again I see firefox taking up an entire core on my workstation. Today's thread of interest is #1, which appears to be deep in SpiderMonkey's garbage collector (note all the js::gc and GCMarker frames):

$ pstack 19575
19575:  /usr/lib/firefox/firefox
------------  lwp# 1 / thread# 1  ---------------
 f21bf265 mmap     (1000000, fa9d97f0, 0, 806df3a, 0, 0) + 15
 0806df3a huge_palloc (8084b40, 34000000, 55600000, 807006d, e54bd608, bb03e498) + 64
 0807006d je_realloc (e54bd608, e35fc0a0, 19504, 80693c9) + 8ee
 080693c9 realloc  (cde84f80, cde84f81, 6020e100, f159bcc2) + 45
 f159bcc2 _ZN2js9MarkStack7enlargeEj (76d29960, e54bd608, fa9d9968, f15a1308) + 58
 f15a1308 _ZN2js8GCMarker8traverseIP8JSObjectEEvT_ (e54bd608, e54bd608, fa9d99b8, f15aa837, e54bd608, 76d29960) + 42
 f15aa837 _Z9DoMarkingI8JSObjectEvPN2js8GCMarkerEPT_ (f16b8000, dc54f064, fa9d99f8, f15b710d) + 39
 f15b710d _Z9DoMarkingIN2JS5ValueEEvPN2js8GCMarkerERKT_ (f16b8000, fa9d9ab0, fa9d9a28, f15b718e) + 42
 f15b718e _Z16DispatchToTracerIN2JS5ValueEEvP8JSTracerPT_PKc (e54bd608, fa9d9ab0, ee3e9323, f15b71e2) + 38
 f15b71e2 _ZN2js9TraceEdgeIN2JS5ValueEEEvP8JSTracerPNS_18WriteBarrieredBaseIT_EEPKc (e54bd608, e350fee0, ee30859c, f10d79e8, e54bd608, 3a7715a0) + 1e
 f10d79e8 _ZN2js9MapObject4markEP8JSTracerP8JSObject (fa9d9de8, e54bd608, fa9d9b78, f15a4883) + 1c0
 f15a4883 _ZN2js8GCMarker14drainMarkStackERNS_11SliceBudgetE (fa9d9b98, 0, 4378ac, f13376b7, e54bd608, fa9d9de8) + 4b3
 f13376b7 _ZN2js2gc9GCRuntime14drainMarkStackERNS_11SliceBudgetENS_7gcstats5PhaseE (f148e7e5, fa9d9c1c, e54bb4f0, f1350f5b) + 3f
 f1350f5b _ZN2js2gc9GCRuntime23incrementalCollectSliceERNS_11SliceBudgetEN2JS8gcreason6ReasonERNS_26AutoLockForExclusiveAccessE (568bb, 3e8, 5, f1351bff, e54bb4f0, fa9d9de8) + 26d
 f1351bff _ZN2js2gc9GCRuntime7gcCycleEbRNS_11SliceBudgetEN2JS8gcreason6ReasonE (fa9d9dc8, f1351e33, 5abff385, f1351e9e, e54bb4f0, 0) + 25d
 f1351e9e _ZN2js2gc9GCRuntime7collectEbNS_11SliceBudgetEN2JS8gcreason6ReasonE (f16b8000, f1b24820, fa9d9e00, f135223e, e54bb4f0, 0) + 16e
 f135223e _ZN2js2gc9GCRuntime7startGCE18JSGCInvocationKindN2JS8gcreason6ReasonEx (0, 0, fa9d9ea0, f13523b7, e54bb4f0, 0) + 8c
 f13523b7 _ZN2js2gc9GCRuntime13gcIfRequestedEv (e2f2b000, fa9d9ed8, fa9d9ee0, f1457b6b) + 55
 f1457b6b _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9da250, fa9ddead, fa9d9f60, f1457c03) + 292
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000, e2f26c38, ffffff8c, f1457c7c, fa9da0a4, e4a253d8) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e4a25280, f1b24818, fa9da0a4, f13b4fb4, e54bb000, fa9da0c8) + 46
 f13b4fb4 _ZNK2js7Wrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (fa9da250, fa9da518, 1, f13a8784) + 200
 f13a8784 _ZNK2js23CrossCompartmentWrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (fa9da278, fa9da260, fa9da1a8, f13a6de3) + 104
 f13a6de3 _ZN2js5Proxy4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (e2d59c70, e54bb000, fa9da2d8, f13a7809, e54bb000, fa9da1f0) + eb
 f13a7809 _ZN2js10proxy_CallEP9JSContextjPN2JS5ValueE (4, e54bb000, 0, f14579ba) + 60
 f14579ba _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9da660, 6eabb3c0, f1b24834, f1457c03) + e1
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (651d54b0) + 87
 f1457c2d _ZN2js13CallFromStackEP9JSContextRKN2JS8CallArgsE (0, 0, 0, f1652df7) + 1b
 f1652df7 _ZN2js3jitL14DoCallFallbackEP9JSContextPNS0_13BaselineFrameEPNS0_15ICCall_FallbackEjPN2JS5ValueENS7_13MutableHandleIS8_EE (0, f1b24828, 0, 2a74025f, e54bb000, fa9da578) + 4a7
 2a74025f ???????? (2acf6833, 8021, 7b460c70, ffffff8c, 7b460c60, ffffff8c)
 651d54d0 ???????? (3444, e2d5f940, 4, d9d5d910, ffffff8c, 62e048c0)
 2a73f909 ???????? (2acf6370, 5, fa9daa18, 0, e2d5f940, 0)
 f162d642 _ZL13EnterBaselineP9JSContextRN2js3jit12EnterJitDataE (e54bb000, fa9da718, fa9da880, f1630993, e35cf448, f19105ac) + 15b
 f1630993 _ZN2js3jit19EnterBaselineMethodEP9JSContextRNS_8RunStateE (e54bb000, fa9da7dc, fa9da880, f1457837) + 10b
 f1457837 _ZN2js9RunScriptEP9JSContextRNS_8RunStateE (e2f2b000, fa9da868, fa9da870, f1457b3c, e54bb000, fa9da880) + 2d7
 f1457b3c _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (e54bb000, fa9ddead, fa9da8f0, f1457c03) + 263
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000, e31e9438, ffffff8c, f1457c7c, fa9daa34, e4a252e8) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e4a25280, f1b24818, fa9daa34, f13b4fb4, e54bb000, fa9daa58) + 46
 f13b4fb4 _ZNK2js7Wrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (f1b24800, e2d07190, fa9daa98, f13a8784) + 200
 f13a8784 _ZNK2js23CrossCompartmentWrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (e487ac10, fa9dab8c, f257263e, f13a6de3) + 104
 f13a6de3 _ZN2js5Proxy4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (e2d5f940, e54bb000, fa9dac68, f13a7809, e54bb000, fa9dab80) + eb
 f13a7809 _ZN2js10proxy_CallEP9JSContextjPN2JS5ValueE (4, e54bb000, 0, f14579ba) + 60
 f14579ba _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dafb0, 79c58750, f1b24834, f1457c03) + e1
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54be37c) + 87
 f1457c2d _ZN2js13CallFromStackEP9JSContextRKN2JS8CallArgsE (0, 0, 0, f1652df7) + 1b
 f1652df7 _ZN2js3jitL14DoCallFallbackEP9JSContextPNS0_13BaselineFrameEPNS0_15ICCall_FallbackEjPN2JS5ValueENS7_13MutableHandleIS8_EE (fa9daed8, f1b24820, fa9db1d8, 2a74025f, e54bb000, fa9daed8) + 4a7
 2a74025f ???????? (2acf59f4, 5821, 7ae939a0, ffffff8c, 52cd2840, ffffff8c)
 648bf320 ???????? (2444, d9d40a80, 1, 0, ffffff82, 7b858a60)
 2a73f909 ???????? (2acf5680, 2, fa9db368, 0, d9d40a80, 0)
 f162d642 _ZL13EnterBaselineP9JSContextRN2js3jit12EnterJitDataE (e54bb000, fa9db068, fa9db1d0, f1630993, e2d104f0, f16e0134) + 15b
 f1630993 _ZN2js3jit19EnterBaselineMethodEP9JSContextRNS_8RunStateE (e54bb000, fa9db12c, fa9db1d0, f1457837) + 10b
 f1457837 _ZN2js9RunScriptEP9JSContextRNS_8RunStateE (d6ec4800, f16b8000, fa9db1c0, f1457b3c, e54bb000, fa9db1d0) + 2d7
 f1457b3c _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9db3ac, 238be150, fa9db240, f1457c03) + 263
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000, e2f2b038, ffffff82, f1457c7c, fa9db384, e4a25208) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e4a24fb8, f1b24818, fa9db384, f13b4fb4, e54bb000, fa9db3a8) + 46
 f13b4fb4 _ZNK2js7Wrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (fa9db5ec, fa9db8a8, fa9db400, f13a8784) + 200
 f13a8784 _ZNK2js23CrossCompartmentWrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (e4a25088, 84, fa9db470, f13a6de3) + 104
 f13a6de3 _ZN2js5Proxy4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (d9d40a80, e54bb000, fa9db4d0, f13a7809, e54bb000, fa9db4d0) + eb
 f13a7809 _ZN2js10proxy_CallEP9JSContextjPN2JS5ValueE (e4a25088, ffffff8c, fa9db520, f14579ba) + 60
 f14579ba _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9db71c, 1, fa9db608, f1457c03) + e1
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (d9d40a80, fa9db720, ffffff82, f1457c7c, fa9db5fc, fa9db5f0) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e54bb000, 29, fa9db738, f145c065, e54bb000, fa9db7d8) + 46
 f145c065 _ZN2js19SpreadCallOperationEP9JSContextN2JS6HandleIP8JSScriptEEPhNS3_INS2_5ValueEEES9_S9_S9_NS2_13MutableHandleIS8_EE (e54bb000, fa9db760, 54df14a0, f1652875, e54bb000, fa9db7fc) + 427
 f1652875 _ZN2js3jitL20DoSpreadCallFallbackEP9JSContextPNS0_13BaselineFrameEPNS0_15ICCall_FallbackEPN2JS5ValueENS7_13MutableHandleIS8_EE (f1b24820, fa9db834, e35d5d00, 2a7401e8, e54bb000, fa9db8d8) + 237
 2a7401e8 ???????? (2ae0c852, 4021, 7b858a90, ffffff8c, 0, ffffff82)
 54df14a0 ???????? (2042, d9d3f0d0, 1, 0, ffffff82, 7b858a90)
 2ab50cd4 ???????? (2acf0414, 4821, 7b858af0, ffffff8c, 0, ffffff82)
 5ddfe7a8 ???????? (2042, e2d5b080, 2, 0, ffffff82, d9d3f0d0)
 2ab50cd4 ???????? (2acd3bc4, 4821, 7b858b50, ffffff8c, 0, ffffff82)
 51417888 ???????? (3444, e2d5b060, 3, 0, ffffff82, d9d36c60)
 2a73f909 ???????? (2acd34d0, 4, fa9dbe78, 0, e2d5b060, 0)
 f162d642 _ZL13EnterBaselineP9JSContextRN2js3jit12EnterJitDataE (e54bb000, fa9dbb78, fa9dbce0, f1630993, 2422, 1) + 15b
 f1630993 _ZN2js3jit19EnterBaselineMethodEP9JSContextRNS_8RunStateE (e54bb000, fa9dbc3c, fa9dbce0, f1457837) + 10b
 f1457837 _ZN2js9RunScriptEP9JSContextRNS_8RunStateE (416b7318, 416b6f10, fa9dbcd0, f1457b3c, e54bb000, fa9dbce0) + 2d7
 f1457b3c _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dbd50, e4a24728, 416b6c68, f1457c03) + 263
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000, e31eb038, ffffff82, f1457c7c, fa9dbe9c, 1) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (8daff0e4, f1b24818, fa9dbe94, f13b4fb4, e54bb000, fa9dbeb8) + 46
 f13b4fb4 _ZNK2js7Wrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (6, 416b6c68, 416b7258, f13a8784) + 200
 f13a8784 _ZNK2js23CrossCompartmentWrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (f2232a40, 0, fa9dbf98, f13a6de3) + 104
 f13a6de3 _ZN2js5Proxy4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (e2d5b060, 1010000, 10000, f13a7809, e54bb000, fa9dbfe0) + eb
 f13a7809 _ZN2js10proxy_CallEP9JSContextjPN2JS5ValueE (e7b08be0, ec112a00, 0, f14579ba) + 60
 f14579ba _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dc22c, 3, fa9dc118, f1457c03) + e1
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e2d5b060, 0, ffffff82, f1457c7c, f16b8000, fa9dc100) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e54bb000, 29, fa9dc248, f145c065, e54bb000, fa9dc2e8) + 46
 f145c065 _ZN2js19SpreadCallOperationEP9JSContextN2JS6HandleIP8JSScriptEEPhNS3_INS2_5ValueEEES9_S9_S9_NS2_13MutableHandleIS8_EE (e54bb000, fa9dc270, 50a854c8, f1652875, e54bb000, fa9dc30c) + 427
 f1652875 _ZN2js3jitL20DoSpreadCallFallbackEP9JSContextPNS0_13BaselineFrameEPNS0_15ICCall_FallbackEPN2JS5ValueENS7_13MutableHandleIS8_EE (2, ffffff81, e4a24f18, 2a7401e8, e54bb000, fa9dc3f8) + 237
 2a7401e8 ???????? (2acd0acc, 4821, 6a5121a0, ffffff8c, 0, ffffff82)
 50a854c8 ???????? (2444, e2d5f820, 2, d9d5d910, ffffff8c, 7b858bb0)
 2a73f909 ???????? (2acd05c0, 3, fa9dc888, 0, e2d5f820, 0)
 f162d642 _ZL13EnterBaselineP9JSContextRN2js3jit12EnterJitDataE (e54bb000, fa9dc588, fa9dc6f0, f1630993, e35cf448, f19105ac) + 15b
 f1630993 _ZN2js3jit19EnterBaselineMethodEP9JSContextRNS_8RunStateE (e54bb000, fa9dc64c, fa9dc6f0, f1457837) + 10b
 f1457837 _ZN2js9RunScriptEP9JSContextRNS_8RunStateE (e2f2d400, fa9dc6d0, fa9dc6e0, f1457b3c, e54bb000, fa9dc6f0) + 2d7
 f1457b3c _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dc760, 7b8a36b0, fa9dc760, f1457c03) + 263
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000, e31e9438, ffffff8c, f1457c7c, fa9dc8a4, e4a24de0) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e4a246c8, f1b24818, fa9dc8a4, f13b4fb4, e54bb000, fa9dc8c8) + 46
 f13b4fb4 _ZNK2js7Wrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (461fe100, e20761c8, fa9dccc8, f13a8784) + 200
 f13a8784 _ZNK2js23CrossCompartmentWrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (fa9dccf0, e54bb000, fa9dc990, f13a6de3) + 104
 f13a6de3 _ZN2js5Proxy4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (e2d5f820, e54bb000, fa9dcad8, f13a7809, e54bb000, fa9dc9f0) + eb
 f13a7809 _ZN2js10proxy_CallEP9JSContextjPN2JS5ValueE (2, e54bb000, 0, f14579ba) + 60
 f14579ba _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dce20, 7c32cd30, f1b24834, f1457c03) + e1
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000) + 87
 f1457c2d _ZN2js13CallFromStackEP9JSContextRKN2JS8CallArgsE (0, 0, 0, f1652df7) + 1b
 f1652df7 _ZN2js3jitL14DoCallFallbackEP9JSContextPNS0_13BaselineFrameEPNS0_15ICCall_FallbackEjPN2JS5ValueENS7_13MutableHandleIS8_EE (fa9dcd48, f1b24820, fa9dd048, 2a74025f, e54bb000, fa9dcd48) + 4a7
 2a74025f ???????? (2ac26a8e, 6821, 6a5121f0, ffffff8c, d9d36300, ffffff8c)
 248a0b88 ???????? (2444, d9d3f100, 1, 0, ffffff82, 7b460ca0)
 2a73f909 ???????? (2ac26150, 2, fa9dd1d8, 0, d9d3f100, 0)
 f162d642 _ZL13EnterBaselineP9JSContextRN2js3jit12EnterJitDataE (e54bb000, fa9dced8, fa9dd040, f1630993, e2d10748, f19105ac) + 15b
 f1630993 _ZN2js3jit19EnterBaselineMethodEP9JSContextRNS_8RunStateE (e54bb000, fa9dcf9c, fa9dd040, f1457837) + 10b
 f1457837 _ZN2js9RunScriptEP9JSContextRNS_8RunStateE (d6836000, e54bb000, fa9dd030, f1457b3c, e54bb000, fa9dd040) + 2d7
 f1457b3c _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dd1b8, f1495b92, fa9dd0b0, f1457c03) + 263
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000, e2f2d438, ffffff82, f1457c7c, fa9dd1f4, e4a246a0) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (e4a24560, f1b24818, fa9dd1f4, f13b4fb4, e54bb000, fa9dd218) + 46
 f13b4fb4 _ZNK2js7Wrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (461fe100, e489ef38, fa9dd29c, f13a8784) + 200
 f13a8784 _ZNK2js23CrossCompartmentWrapper4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (f16b8000, fa9dd4e4, f1914040, f13a6de3) + 104
 f13a6de3 _ZN2js5Proxy4callEP9JSContextN2JS6HandleIP8JSObjectEERKNS3_8CallArgsE (d9d3f100, e54bb000, fa9dd428, f13a7809, e54bb000, fa9dd340) + eb
 f13a7809 _ZN2js10proxy_CallEP9JSContextjPN2JS5ValueE (1, e54bb000, 0, f14579ba) + 60
 f14579ba _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (fa9dd900, d9d365f0, f1b24834, f1457c03) + e1
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (e54bb000) + 87
 f1457c2d _ZN2js13CallFromStackEP9JSContextRKN2JS8CallArgsE (0, 0, 0, f1652df7) + 1b
 f1652df7 _ZN2js3jitL14DoCallFallbackEP9JSContextPNS0_13BaselineFrameEPNS0_15ICCall_FallbackEjPN2JS5ValueENS7_13MutableHandleIS8_EE (e4a24450, ffffff8c, 2, 2a74025f, e54bb000, fa9dd728) + 4a7
 2a74025f ???????? (2ac2ab3e, f821, 7b460cd0, ffffff8c, 0, ffffff82)
 3c0c3208 ???????? (3045, d9d629a0, 3, d9d36520, ffffff8c, 52cd2140)
 2a73f774 ???????? (3042, d9d629a0, 3, d9d36520, ffffff8c, 52cd2140)
 2aad1e94 ???????? (2ae556d3, 6821, dd48e9d0, ffffff85, 76d68340, ffffff8c)
 47f1f428 ???????? (3444, d9d62820, 3, d9d36520, ffffff8c, 52cd2140)
 2a73f909 ???????? (2ae552e0, 4, fa9ddcd8, 0, d9d62820, 0)
 f162d642 _ZL13EnterBaselineP9JSContextRN2js3jit12EnterJitDataE (e54bb000, fa9dd9b8, fa9ddb20, f1630993, f16b8000, fa9ddaac) + 15b
 f1630993 _ZN2js3jit19EnterBaselineMethodEP9JSContextRNS_8RunStateE (e54bb000, fa9dda7c, fa9ddb20, f1457837) + 10b
 f1457837 _ZN2js9RunScriptEP9JSContextRNS_8RunStateE (fa9ddbe0, d9b37ee0, f16b8000, f1457b3c, e54bb000, fa9ddb20) + 2d7
 f1457b3c _ZN2js23InternalCallOrConstructEP9JSContextRKN2JS8CallArgsENS_14MaybeConstructE (e54bb4f0, e54bb000, fa9ddb90, f1457c03) + 263
 f1457c03 _ZL12InternalCallP9JSContextRKN2js13AnyInvokeArgsE (fa9ddedc, fa9ddd1c, ffffff8c, f1457c7c, e54bb000, 52cd2140) + 87
 f1457c7c _ZN2js4CallEP9JSContextN2JS6HandleINS2_5ValueEEES5_RKNS_13AnyInvokeArgsENS2_13MutableHandleIS4_EE (fa9ddcd8, fa9ddcd8, fa9ddc28, f1326a7c, e54bb000, fa9ddee8) + 46
 f1326a7c _Z20JS_CallFunctionValueP9JSContextN2JS6HandleIP8JSObjectEENS2_INS1_5ValueEEERKNS1_16HandleValueArrayENS1_13MutableHandleIS6_EE (f16b8000, e5474150, fa9ddfd8, eed2a425, e54bb000, fa9dded0) + 217
 eed2a425 _ZN19nsXPCWrappedJSClass10CallMethodEP14nsXPCWrappedJStPK19XPTMethodDescriptorP17nsXPTCMiniVariant (fa9de000, f16b8000, fa9de018, eed2aee3, e3943760, d6d8d0c0) + b93
 eed2aee3 _ZN14nsXPCWrappedJS10CallMethodEtPK19XPTMethodDescriptorP17nsXPTCMiniVariant (fa9de0f0, 0, 2, ee52253e, d6d8d0c0, 3) + b3
 ee52253e PrepareAndDispatch (45b42e00, fa9de140, 0, ee4ddfd5, d688f510, 45d2082c) + 126
 ee4ddfd5 _ZN14nsObserverList15NotifyObserversEP11nsISupportsPKcPKDs (fa9de1a0, f16b8000, fa9de1c8, ee4de0ec) + 79
 ee4de0ec _ZN17nsObserverService15NotifyObserversEP11nsISupportsPKcPKDs (616d692f, 2f736567, fa9de200, ee844efc, e7b079a0, 45d2082c) + ec
 ee844efc _ZN7mozilla3net13nsHttpHandler15NotifyObserversEP14nsIHttpChannelPKc (45d20800, fa9de240, fa9de468, ee8c13b1, de96a000, 45d2082c) + aa
 ee8c13b1 _ZN7mozilla3net13nsHttpChannel12BeginConnectEv (f16b8000, fa9de490, fa9de4b8, ee8c19f2, 45d20800, fa9de560) + b79
 ee8c19f2 _ZN7mozilla3net13nsHttpChannel16OnProxyAvailableEP13nsICancelableP10nsIChannelP12nsIProxyInfo8nsresult (fa9de4e0, 0, fa9de588, ee5b6815, 45d20800, 80612338) + 14c
 ee5b6815 _ZN7mozilla3net21nsAsyncResolveRequest10DoCallbackEv (80612330, fa9de5b0, fa9de5c8, ee5b6c24) + 2cd
 ee5b6c24 _ZN7mozilla3net21nsAsyncResolveRequest15OnQueryCompleteE8nsresultRK9nsCStringS5_ (f1bac580, 1, fa9de5f8, ee5b45bd, 80612330, 80040111) + 48
 ee5b45bd _ZN7mozilla3net15ExecuteCallback3RunEv (f1b6a2a0, 0, fa9de640, ee512380) + 25
 ee512380 _ZN8nsThread16ProcessNextEventEbPb (f1b42400, e7b07830, fa9de6b8, ee53cb1c) + 292
 ee53cb1c _Z19NS_ProcessNextEventP9nsIThreadb (f1b42400, fa9de6e0, fa9de718, ee92e6dd, f1b6a2a0, 0) + 35
 ee92e6dd _ZN7mozilla3ipc11MessagePump3RunEPN4base11MessagePump8DelegateE (f16b8000, fa9de9e0, fa9de768, ee90b592) + 127
 ee90b592 _ZN11MessageLoop11RunInternalEv (8, 0, fa9de780, ee90b891, f1b42400, ee90b888) + 1c
 ee90b891 _ZN11MessageLoop3RunEv (fa9de7b0, fa9de7b0) + 27
 f01ce720 _ZN14nsBaseAppShell3RunEv (f1b41400, 0, 0, f0962787) + 34
 f0962787 _ZN12nsAppStartup3RunEv (e7b079a0, fa9de820, fa9de938, f09de60d, e46183a0, f1b17420) + 2d
 f09de60d _ZN7XREMain11XRE_mainRunEv (fa9de960, f16b8000, fa9de998, f09de929, fa9de9e0, fa9de97b) + bc3
 f09de929 _ZN7XREMain8XRE_mainEiPPcPK12nsXREAppData (fa9de9c0, f16b8000, fa9deb18, f09debdd, fa9de9e0, 6) + 1e7
 f09debdd XRE_main (fa9defa4, 8057e44, 7, 805a075, 6, fa9df084) + 12f
 0805a075 _ZL7do_mainiPPcS0_P7nsIFile (fa9df038, 805a185, 6, 805a19d, f1b6f060, 69671e92) + 377
 0805a19d main     (fa9df1b4, fa9df054, 8059921) + ba
 08059886 _start   (6, fa9df238, fa9df262, fa9df265, 0, 0) + 46



I'm embarrassed to admit it

Let's file this under "embarrassing misses which I hope make me a better engineer": I've been beating my head against getting the Solaris Userland build of Python Cryptography updated from the somewhat old v1.7.2 to the current v2.1.4. The first thing I realised was that we needed a new version of cffi. That was pretty easy to update, though it meant I had to update xattr too (all updates, like bugs, have friends).

After migrating cffi from building with Studio to gcc and mucking around with patches, I finally had something I could install in a crash-n-burn kz in our cloud. Yay, or so I thought.

When I tried to run pkg verify in that kz with the new packages, the packaging system died in a heap with an error that I distilled down to this:

$ python2.7
Python 2.7.14 (default, Jan 31 2018, 05:35:05) [C] on sunos5
Type "help", "copyright", "credits" or "license" for more information.
>>> from cryptography.hazmat.bindings._openssl import ffi, lib
>>> from cryptography.x509 import certificate_transparency
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/vendor-packages/cryptography/x509/__init__.py", line 7, in <module>
    from cryptography.x509 import certificate_transparency
ImportError: cannot import name certificate_transparency

What on earth does that mean? A week or so of googling and delving deep into the code left me no wiser. I came across one "solution" on Stack Overflow which said to "merely" update pip from v7 to v9, and "that solved everything". That's not what I think of as a solution. It might be sufficient for an end user, but I'm not in that position; I'm the bloke doing the release engineering to make sure the end user never hits the problem.

After finally realising that there's a Python Cryptography channel (#cryptography-dev) on irc.freenode.net I wandered in and asked for help. Alex asked if I got an exception from

from cryptography.hazmat.bindings._openssl import ffi, lib

which I didn't, but a subsequent from cryptography.x509 import certificate_transparency got me the same ImportError I noted above. THAT got me wondering if cffi was built correctly. As it happens, it wasn't (all my fault, yes):

>>> import cffi
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/vendor-packages/cffi/__init__.py", line 4, in <module>
    from .api import FFI
  File "/usr/lib/python2.7/vendor-packages/cffi/api.py", line 3, in <module>
    from .error import CDefError
ImportError: No module named error

In short order this led me to copy the bits from my build system's proto area directly to the crash-n-burn kz for both cffi and cryptography, and boom - I could import certificate_transparency.

You can see where this is going: a quick check of the proto areas for files which weren't included in each package manifest, an update of each p5m (followed by pkgfmt), and a gmake publish:

$ cd build/prototype/`uname -p`
$  for f in `find usr  -name \*py -o -name \*h -o -name \*so | \
   sed -e"s,python[23].[745],python\$\(PYVER),g" |sort |uniq | \
   grep -v  cpython`; do \
        grep -q "$f" ../../../$P5M || \
        echo "file path=$f" >> ../../../$P5M; 
   done
$ cd ../../..
$ pkgfmt $P5M

I'm kicking myself that I forgot that tiny step in my efforts over the last week, but I won't forget it in future.




Coot-tha, yay!

The last few years haven't been particularly good for my fitness, with a combination of a pretty hectic workload, increasing school activities for the children, and recovery from my left knee meniscus repair in 2016. So last year's cycling effort of 2100km (achieved almost completely in the last 3 months of the year) was really nice. Really, really nice.

I've started off this year a little better - I've exceeded my 150km weekly goal each week except for the first week of the year, and I'm at 1245km ridden out of my 4000km goal for the whole year. Yay me!

Last week I was really stoked to pull out a 102km effort (in 4h5m), and late yesterday figured that I really should have a crack at Mt Coot-tha again. The last time I rode it (rather than walked!) was on 7 March 2014, and at that point I'd ridden only about 200km for the year - so the effort involved was quite large.

Today... yes, the effort involved was still quite large (and I'm feeling a bit more knackered than after last week's 102km, to be honest). On the positive side, however, I rode 71.3km in 3h7m, compared with nearly-four-years-ago's 70.1km in 3h21m. I managed to get up both of the climbs that I'm interested in, and did so in personal best times, so I'm pretty happy.

Here they are, along with the overall tracks and summaries from 2014 and today: