Thursday, 19 October 2017

Zipping and Tarring on macOS - with added funkiness

So I had a specific requirement yesterday - I wanted to extract three specific files from a ZIP file.

This is what I had: -

unzip -l certificate-bundle.zip

Archive:  certificate-bundle.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
        0  10-19-2017 16:58   ca/
     1310  10-19-2017 16:58   ca/ca.crt
     1679  10-19-2017 16:58   ca/ca.key
        0  10-19-2017 16:58   node1/
     1379  10-19-2017 16:58   node1/node1.crt
     1679  10-19-2017 16:58   node1/node1.key

---------                     -------
     6047                     6 files


So I wanted to extract the certificates and one of the keys …. and place them into specific locations

BUT…..

I didn't want the paths, just the files.

Whilst zip supports this: -

-j  junk paths (do not make directories) 

alas, unzip does not.

Thankfully, the internet had the answer: -

How do I exclude absolute paths for Tar?

I knew that I could use tar on a ZIP file ( macOS's tar is bsdtar, which auto-detects the archive format ), but this was a nuance.

So here're the commands that I used: -

tar xvzf ~/certificate-bundle.zip --strip-components=1 -C ~/Desktop/elasticsearch-config/x-pack ca/ca.crt
tar xvzf ~/certificate-bundle.zip --strip-components=1 -C ~/Desktop/elasticsearch-config/x-pack node1/node1.crt
tar xvzf ~/certificate-bundle.zip --strip-components=1 -C ~/Desktop/elasticsearch-config/x-pack node1/node1.key


So we use --strip-components to remove the path and -C to place the files into specific locations.
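As an aside, Info-ZIP's unzip does actually have a -j ( junk paths ) flag of its own, but the tar route works just as well. Here's a self-contained sketch of the technique, using throwaway files under /tmp ( all paths hypothetical ): -

```shell
# Build a demo archive containing a nested file, mirroring ca/ca.crt
mkdir -p /tmp/strip-demo/ca /tmp/strip-demo/out
echo "dummy certificate" > /tmp/strip-demo/ca/ca.crt
tar -C /tmp/strip-demo -cf /tmp/strip-demo/bundle.tar ca/ca.crt

# Extract it flat: --strip-components=1 drops the leading 'ca/' directory,
# and -C drops the file straight into the target directory
tar -xf /tmp/strip-demo/bundle.tar --strip-components=1 -C /tmp/strip-demo/out ca/ca.crt

ls /tmp/strip-demo/out
# prints: ca.crt
```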

So that's all good then :-)



IBM BPM and Elasticsearch - with added TLS

Following this: -



I've been tinkering further with Elasticsearch on Docker, establishing a TLS connection between it and IBM BPM.

Here are my notes: -

Pull Image

docker pull docker.elastic.co/elasticsearch/elasticsearch:5.6.3

Start container

es=`docker run -d -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:5.6.3`

Check logs

docker logs $es -f

Upload YAML for Certgen

docker cp ~/instances.yml $es:/usr/share/elasticsearch/config
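For reference, certgen reads a YAML file describing the instances to generate certificates for. Mine looked something like this ( a minimal sketch; the instance name and DNS entry are assumptions, based on the node1 naming in the bundle and the node1.uk.ibm.com alias used later ): -

```yaml
instances:
  - name: "node1"
    dns:
      - "node1.uk.ibm.com"
```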

Generate Self-Signed Certificate, plus Keys

docker exec -i -t $es /bin/bash -c "/usr/share/elasticsearch/bin/x-pack/certgen -in /usr/share/elasticsearch/config/instances.yml -out /usr/share/elasticsearch/certificate-bundle.zip"

Download Certificates

docker cp $es:/usr/share/elasticsearch/certificate-bundle.zip ~

Stop Container

docker stop $es

Remove Container

docker rm $es

Extract and place certificates and key

tar xvzf ~/certificate-bundle.zip --strip-components=1 -C ~/Desktop/elasticsearch-config/x-pack ca/ca.crt

tar xvzf ~/certificate-bundle.zip --strip-components=1 -C ~/Desktop/elasticsearch-config/x-pack node1/node1.crt

tar xvzf ~/certificate-bundle.zip --strip-components=1 -C ~/Desktop/elasticsearch-config/x-pack node1/node1.key

Re-start container

Note: we're mapping ~/Desktop/elasticsearch-config as the ES config root

es=`docker run -d -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" -v /Users/davidhay/Desktop/elasticsearch-config:/usr/share/elasticsearch/config docker.elastic.co/elasticsearch/elasticsearch:5.6.3`

Check logs

docker logs $es -f

Test using Curl - on host

curl --insecure https://localhost:9200 -u elastic:changeme

Should return: -

{
  "name" : "-2S40f4",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "zV8P1a4FR26Q_J_h1E0QKA",
  "version" : {
    "number" : "5.6.3",
    "build_hash" : "1a2f265",
    "build_date" : "2017-10-06T20:33:39.012Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.1"
  },
  "tagline" : "You Know, for Search"
}

or similar

Test using browser

Default credentials are elastic/changeme


Should return same JSON

Test on BPM box

Hostname node1.uk.ibm.com aliased to IP address of host Mac

curl --insecure https://node1.uk.ibm.com:9200 -u elastic:changeme

{
  "name" : "-2S40f4",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "zV8P1a4FR26Q_J_h1E0QKA",
  "version" : {
    "number" : "5.6.3",
    "build_hash" : "1a2f265",
    "build_date" : "2017-10-06T20:33:39.012Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.1"
  },
  "tagline" : "You Know, for Search"
}

or similar

Place CA certificate on BPM box

scp ~/Desktop/elasticsearch-config/x-pack/ca.crt wasadmin@bpm86:~
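Before pointing BPM at the certificate, it's worth sanity-checking it. Here's a self-contained sketch using a throwaway self-signed CA as a stand-in for the real ca.crt ( all names and paths hypothetical ): -

```shell
# Generate a throwaway self-signed CA certificate and key
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo-ca" \
  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.crt -days 365 2>/dev/null

# Inspect the subject and validity window, as you might for the real ca.crt
openssl x509 -in /tmp/demo-ca.crt -noout -subject -dates
```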

Update BPM Event Emitter YAML files

vi /opt/ibm/WebSphereProfiles/Dmgr01/config/cells/PCCell1/nodes/Node1/servers/SupClusterMember1/analytics/config/BPMEventEmitter.yml

vi /opt/ibm/WebSphereProfiles/Dmgr01/config/cells/PCCell1/clusters/SupCluster/analytics/config/BPMEventEmitter.yml

ES configuration as follows: -

...
esConfiguration:
    enabled: true
    # The Elasticsearch index name
    index: bpm-events
    # Enable the following properties when Elasticsearch security is on.
    username: elastic
    password: changeme
    httpsTrustType: CRT
    trustFileLocation: /home/wasadmin/ca.crt
    hostnameVerifier: false
    esTaskIndex: restore_task_index
...

Synchronise Node

/opt/ibm/WebSphereProfiles/Dmgr01/bin/wsadmin.sh -lang jython -f fullSync.jy

Validate Sync

ls -al `find /opt/ibm/WebSphereProfiles -name BPMEventEmitter.yml`

-rw-r--r-- 1 wasadmin wasadmins 2793 Oct 19 16:54 /opt/ibm/WebSphereProfiles/AppSrv01/config/cells/PCCell1/clusters/SupCluster/analytics/config/BPMEventEmitter.yml
-rw-r--r-- 1 wasadmin wasadmins 2793 Oct 19 16:54 /opt/ibm/WebSphereProfiles/AppSrv01/config/cells/PCCell1/nodes/Node1/servers/SupClusterMember1/analytics/config/BPMEventEmitter.yml
-rw-r--r-- 1 wasadmin wasadmins 2762 Sep 18 08:51 /opt/ibm/WebSphereProfiles/AppSrv01/installedApps/PCCell1/BPMEventEmitter_war_De1.ear/BPMEventEmitter.war/WEB-INF/classes/BPMEventEmitter.yml
-rw-r--r-- 1 wasadmin wasadmins 2797 Oct 19 17:19 /opt/ibm/WebSphereProfiles/Dmgr01/config/cells/PCCell1/clusters/SupCluster/analytics/config/BPMEventEmitter.yml
-rw-r--r-- 1 wasadmin wasadmins 2797 Oct 19 17:19 /opt/ibm/WebSphereProfiles/Dmgr01/config/cells/PCCell1/nodes/Node1/servers/SupClusterMember1/analytics/config/BPMEventEmitter.yml

All but the BPMEventEmitter_war_De1.ear copy of the file should have the same size, date and time

Start App

/opt/ibm/WebSphereProfiles/Dmgr01/bin/wsadmin.sh -lang jython

AdminControl.invoke('WebSphere:name=ApplicationManager,process=SupClusterMember1,platform=proxy,node=Node1,version=8.5.5.12,type=ApplicationManager,mbeanIdentifier=ApplicationManager,cell=PCCell1,spec=1.0', 'startApplication', '[BPMEventEmitter_war_De1]')

quit

Check Logs

tail -f /opt/ibm/WebSphereProfiles/AppSrv01/logs/SupClusterMember1/SystemOut.log

Note

If you see this: -

Caused by: javax.net.ssl.SSLPeerUnverifiedException: Host name '9.174.27.153' does not match the certificate subject provided by the peer (CN=node1, DC=uk, DC=ibm, DC=com)

use: -

hostnameVerifier: false

in BPMEventEmitter.yml

Backup





Monday, 16 October 2017

Apple Watch - go, no go, go

So I had a weird experience last evening, and not in a good way.

For no apparent reason, this was my Apple Watch: -


and this: -


I have no earthly idea what happened.

So, being a true nerd, and a big fan of The IT Crowd, I decided to ( all together now ) TURN IT OFF AND ON AGAIN ….

Obviously I couldn't read the display, what with it being all garbled n' all, so I just hit the big button on the right-hand side, below the digital crown and chose the appropriate gibberish - it was the one in red, so it must've been the right one ? Right ?

WRONG !!

The next thing I know, my Apple Watch has called 999 ( the UK's emergency services number, similar to 911 in the USA ), and I'm talking to an operator, who's asking how he can help.

When I don't immediately respond ( panic has set in at this point ), he's saying "If you're unable to speak, please press a digit on your phone's dial" etc. assuming, for good reason, that I am injured and cannot respond :-(

I manage to find my voice, and tell him that all is well, and apologise profusely for wasting his time and our public resources ….

Then the house phone rings … and my beloved gets a recorded message telling her that Dave Hay has called the emergency services.

And then I get SMS messages on all my Apple devices …..

And then the home phone rings again, with yet another recorded message with my location ( thanks to Apple Maps ).

In short, the Apple ecosystem has kicked in to save me … even though there's nothing wrong with me, apart from my obvious inability to use Apple hardware.

Finally, I manage to power the watch off, set it on its charging stand, so it can reboot - and all seems well.

For the record, this is what I should've done: -


i.e. hit the FIRST rather than the THIRD control.

An update - the landline rang again today, 12 hours later, to tell my beloved that my location had changed - I wonder how much longer it's going to do that ……

IBM Cloud Private - My first foray

So this week, along with many other things, I'm starting to get to grips with the newly announced IBM Cloud Private: -

IBM brings the power of cloud behind the enterprise firewall

I'm running on Ubuntu Linux: -

lsb_release -a

No LSB modules are available.
Distributor ID:    Ubuntu
Description:    Ubuntu 16.04.3 LTS
Release:    16.04
Codename:    xenial


so I started by installing the pre-requisites of VirtualBox and Vagrant: -

sudo apt-get install virtualbox
sudo apt-get install vagrant

and, having cloned the Git repository: -

https://github.com/IBM/deploy-ibm-cloud-private

I followed the instructions to bring up the Vagrant environment: -

vagrant up

Bringing machine 'icp' up with 'virtualbox' provider...
==> icp: Clearing any previously set forwarded ports...
==> icp: Clearing any previously set network interfaces...
==> icp: Preparing network interfaces based on configuration...
    icp: Adapter 1: nat
    icp: Adapter 2: hostonly
==> icp: Forwarding ports...
    icp: 22 (guest) => 2222 (host) (adapter 1)
==> icp: Running 'pre-boot' VM customizations...
A customization command failed:

["modifyvm", :id, "--apic", "on"]

The following error was experienced:

#<Vagrant::Errors::VBoxManageError: There was an error while executing `VBoxManage`, a CLI used by Vagrant
for controlling VirtualBox. The command and stderr is shown below.

Command: ["modifyvm", "6386ef56-d015-4672-919d-40758eeab63c", "--apic", "on"]

Stderr: Oracle VM VirtualBox Command Line Management Interface Version 5.0.40_Ubuntu
(C) 2005-2017 Oracle Corporation
All rights reserved.

Usage:

VBoxManage modifyvm         <uuid|vmname>
                            [--name <name>]
                            [--groups <group>, ...]
                            [--description <desc>]
                            [--ostype <ostype>]
                            [--iconfile <filename>]
                            [--memory <memorysize in MB>]
                            [--pagefusion on|off]
                            [--vram <vramsize in MB>]
                            [--acpi on|off]
                            [--pciattach 03:04.0]
                            [--pciattach 03:04.0@02:01.0]
                            [--pcidetach 03:04.0]
                            [--ioapic on|off]
                            [--hpet on|off]
                            [--triplefaultreset on|off]
                            [--paravirtprovider none|default|legacy|minimal|
                                                hyperv|kvm]
                            [--hwvirtex on|off]
                            [--nestedpaging on|off]
                            [--largepages on|off]
                            [--vtxvpid on|off]
                            [--vtxux on|off]
                            [--pae on|off]
                            [--longmode on|off]
                            [--cpuid-portability-level <0..3>
                            [--cpuidset <leaf> <eax> <ebx> <ecx> <edx>]
                            [--cpuidremove <leaf>]
                            [--cpuidremoveall]
                            [--hardwareuuid <uuid>]
                            [--cpus <number>]
                            [--cpuhotplug on|off]
                            [--plugcpu <id>]
                            [--unplugcpu <id>]
                            [--cpuexecutioncap <1-100>]
                            [--rtcuseutc on|off]
                            [--graphicscontroller none|vboxvga|vmsvga]
                            [--monitorcount <number>]
                            [--accelerate3d on|off]
                            [--accelerate2dvideo on|off]
                            [--firmware bios|efi|efi32|efi64]
                            [--chipset ich9|piix3]
                            [--bioslogofadein on|off]
                            [--bioslogofadeout on|off]
                            [--bioslogodisplaytime <msec>]
                            [--bioslogoimagepath <imagepath>]
                            [--biosbootmenu disabled|menuonly|messageandmenu]
                            [--biossystemtimeoffset <msec>]
                            [--biospxedebug on|off]
                            [--boot<1-4> none|floppy|dvd|disk|net>]
                            [--nic<1-N> none|null|nat|bridged|intnet|hostonly|
                                        generic|natnetwork]
                            [--nictype<1-N> Am79C970A|Am79C973|
                                            82540EM|82543GC|82545EM|
                                            virtio]
                            [--cableconnected<1-N> on|off]
                            [--nictrace<1-N> on|off]
                            [--nictracefile<1-N> <filename>]
                            [--nicproperty<1-N> name=[value]]
                            [--nicspeed<1-N> <kbps>]
                            [--nicbootprio<1-N> <priority>]
                            [--nicpromisc<1-N> deny|allow-vms|allow-all]
                            [--nicbandwidthgroup<1-N> none|<name>]
                            [--bridgeadapter<1-N> none|<devicename>]
                            [--hostonlyadapter<1-N> none|<devicename>]
                            [--intnet<1-N> <network name>]
                            [--nat-network<1-N> <network name>]
                            [--nicgenericdrv<1-N> <driver>
                            [--natnet<1-N> <network>|default]
                            [--natsettings<1-N> [<mtu>],[<socksnd>],
                                                [<sockrcv>],[<tcpsnd>],
                                                [<tcprcv>]]
                            [--natpf<1-N> [<rulename>],tcp|udp,[<hostip>],
                                          <hostport>,[<guestip>],<guestport>]
                            [--natpf<1-N> delete <rulename>]
                            [--nattftpprefix<1-N> <prefix>]
                            [--nattftpfile<1-N> <file>]
                            [--nattftpserver<1-N> <ip>]
                            [--natbindip<1-N> <ip>
                            [--natdnspassdomain<1-N> on|off]
                            [--natdnsproxy<1-N> on|off]
                            [--natdnshostresolver<1-N> on|off]
                            [--nataliasmode<1-N> default|[log],[proxyonly],
                                                         [sameports]]
                            [--macaddress<1-N> auto|<mac>]
                            [--mouse ps2|usb|usbtablet|usbmultitouch]
                            [--keyboard ps2|usb
                            [--uart<1-N> off|<I/O base> <IRQ>]
                            [--uartmode<1-N> disconnected|
                                             server <pipe>|
                                             client <pipe>|
                                             tcpserver <port>|
                                             tcpclient <hostname:port>|
                                             file <file>|
                                             <devicename>]
                            [--lpt<1-N> off|<I/O base> <IRQ>]
                            [--lptmode<1-N> <devicename>]
                            [--guestmemoryballoon <balloonsize in MB>]
                            [--audio none|null|oss|alsa|pulse]
                            [--audiocontroller ac97|hda|sb16]
                            [--audiocodec stac9700|ad1980|stac9221|sb16]
                            [--clipboard disabled|hosttoguest|guesttohost|
                                         bidirectional]
                            [--draganddrop disabled|hosttoguest]
                            [--vrde on|off]
                            [--vrdeextpack default|<name>
                            [--vrdeproperty <name=[value]>]
                            [--vrdeport <hostport>]
                            [--vrdeaddress <hostip>]
                            [--vrdeauthtype null|external|guest]
                            [--vrdeauthlibrary default|<name>
                            [--vrdemulticon on|off]
                            [--vrdereusecon on|off]
                            [--vrdevideochannel on|off]
                            [--vrdevideochannelquality <percent>]
                            [--usb on|off]
                            [--usbehci on|off]
                            [--usbxhci on|off]
                            [--usbrename <oldname> <newname>]
                            [--snapshotfolder default|<path>]
                            [--teleporter on|off]
                            [--teleporterport <port>]
                            [--teleporteraddress <address|empty>
                            [--teleporterpassword <password>]
                            [--teleporterpasswordfile <file>|stdin]
                            [--tracing-enabled on|off]
                            [--tracing-config <config-string>]
                            [--tracing-allow-vm-access on|off]
                            [--usbcardreader on|off]
                            [--autostart-enabled on|off]
                            [--autostart-delay <seconds>]
                            [--videocap on|off]
                            [--videocapscreens all|<screen ID> [<screen ID> ...]]
                            [--videocapfile <filename>]
                            [--videocapres <width> <height>]
                            [--videocaprate <rate>]
                            [--videocapfps <fps>]
                            [--videocapmaxtime <ms>]
                            [--videocapmaxsize <MB>]
                            [--videocapopts <key=value> [<key=value> ...]]
                            [--defaultfrontend default|<name>]

VBoxManage: error: Unknown option: --apic
>

Please fix this customization and try again.


Suspecting that I'd got the wrong versions of the pre-requisites, I checked what I'd installed: -

vagrant -v

Vagrant 1.8.1

VBoxManage -version

5.0.40_Ubuntur115130

whereas the above Git page specifies: -

Vagrant 2.0.0

VirtualBox 5.1.28

I downloaded the latest versions of both: -

https://www.hashicorp.com/blog/hashicorp-vagrant-2-0/

https://www.virtualbox.org/wiki/Linux_Downloads

and started by installing the new version of Vagrant, and retrying the ICP installation: -

vagrant up

Bringing machine 'icp' up with 'virtualbox' provider...
==> icp: Clearing any previously set forwarded ports...
==> icp: Clearing any previously set network interfaces...
==> icp: Preparing network interfaces based on configuration...
    icp: Adapter 1: nat
    icp: Adapter 2: hostonly
==> icp: Forwarding ports...
    icp: 22 (guest) => 2222 (host) (adapter 1)
==> icp: Running 'pre-boot' VM customizations...
==> icp: Booting VM...
There was an error while executing `VBoxManage`, a CLI used by Vagrant
for controlling VirtualBox. The command and stderr is shown below.

Command: ["startvm", "6386ef56-d015-4672-919d-40758eeab63c", "--type", "headless"]

Stderr: VBoxManage: error: The virtual machine 'IBM-Cloud-Private-dev-edition' has terminated unexpectedly during startup with exit code 1 (0x1)
VBoxManage: error: Details: code NS_ERROR_FAILURE (0x80004005), component MachineWrap, interface IMachine

Assuming that the problem was more with VirtualBox than Vagrant, I installed the new version of that ( which took a bit of work with sudo dpkg --remove and sudo dpkg --purge).

Having validated the versions: -

vagrant -v

Vagrant 2.0.0

VBoxManage -v

5.1.28r117968

This time around: -

vagrant up

Bringing machine 'icp' up with 'virtualbox' provider...
==> icp: Clearing any previously set forwarded ports...
==> icp: Clearing any previously set network interfaces...
==> icp: Preparing network interfaces based on configuration...
    icp: Adapter 1: nat
    icp: Adapter 2: hostonly
==> icp: Forwarding ports...
    icp: 22 (guest) => 2222 (host) (adapter 1)
==> icp: Running 'pre-boot' VM customizations...
==> icp: Booting VM...
==> icp: Waiting for machine to boot. This may take a few minutes...
    icp: SSH address: 127.0.0.1:2222
    icp: SSH username: vagrant
    icp: SSH auth method: private key
==> icp: Machine booted and ready!
==> icp: Checking for guest additions in VM...
==> icp: Setting hostname...
==> icp: Running provisioner: shell...
    icp: Running: script: configure_master_ssh_keys
==> icp: Running provisioner: shell...
    icp: Running: script: configure_swap_space
==> icp: Setting up swapspace version 1, size = 8 GiB (8589930496 bytes)
==> icp: no label, UUID=d5e47d79-2646-4bf8-b89d-45b60ca406ff
==> icp: vm.swappiness = 60
==> icp: vm.vfs_cache_pressure = 10
==> icp: Running provisioner: shell...
    icp: Running: script: configure_performance_settings
==> icp: vm.swappiness = 60
==> icp: vm.vfs_cache_pressure = 10
==> icp: net.ipv4.ip_forward = 1

...

==> icp: Starting cfc-worker2
==> icp: Running provisioner: shell...
    icp: Running: script: wait_for_worker_nodes_to_boot
==> icp:
==> icp: Preparing nodes for IBM Cloud Private community edition cluster installation.
==> icp: This process will take approximately 10-20 minutes depending on network speeds.
==> icp: Take a break and go grab a cup of coffee, we'll keep working on this while you're away ;-)
==> icp: .
==> icp: .
==> icp: .
==> icp: master.icp             ready
==> icp: cfc-worker1.icp         ready
==> icp: cfc-worker2.icp         ready
==> icp: cfc-manager1.icp         ready
==> icp: Running provisioner: shell...
    icp: Running: script: precache_images
==> icp:
==> icp: Seeding IBM Cloud Private installation by pre-caching required docker images.
==> icp: This may take a few minutes depending on your connection speed and reliability.
==> icp: Pre-caching docker images....
==> icp: Pulling ibmcom/icp-inception:2.1.0-beta-3...
==> icp: Pulling ibmcom/icp-datastore:2.1.0-beta-3...
==> icp: Pulling ibmcom/icp-platform-auth:2.1.0-beta-3...
==> icp: Pulling ibmcom/icp-auth:2.1.0-beta-3...

...

So it hasn't yet finished, but, in the words of Tom Cruise, "It's looking good so far"

:-)

Ubuntu - Software Updater and the Insufficient Disk Space

So I was trying to update Ubuntu 16.04.3 LTS using Software Updater, but couldn't get past this: -

    

Now I have LOADS of disk space: -

df -kmh

Filesystem                   Size  Used Avail Use% Mounted on
udev                          16G     0   16G   0% /dev
tmpfs                        3.2G  9.4M  3.2G   1% /run
/dev/mapper/ubuntu--vg-root  2.7T  346G  2.2T  14% /
tmpfs                         16G  224K   16G   1% /dev/shm
tmpfs                        5.0M  4.0K  5.0M   1% /run/lock
tmpfs                         16G     0   16G   0% /sys/fs/cgroup
/dev/loop1                    81M   81M     0 100% /snap/core/2381
/dev/loop0                    89M   89M     0 100% /snap/conjure-up/527
/dev/loop2                    80M   80M     0 100% /snap/conjure-up/745
/dev/loop3                    81M   81M     0 100% /snap/core/2462
/dev/loop4                    89M   89M     0 100% /snap/conjure-up/549
/dev/loop5                    82M   82M     0 100% /snap/core/2898
/dev/sda2                    473M  363M   86M  81% /boot
/dev/sda1                    511M  3.4M  508M   1% /boot/efi
tmpfs                        3.2G   76K  3.2G   1% /run/user/1000

and yet /boot is 81% full.

A quick Google brought me here: -

Not enough free disk space when upgrading

which had me do this: -

sudo apt-get autoremove
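What autoremove is clearing out here is, for the most part, old kernel packages; each Ubuntu kernel update leaves its predecessor's images behind in /boot, which soon fills a small boot partition. A quick way to see what's accumulating in there: -

```shell
# List the kernel images and initrds occupying /boot ( may well be empty
# inside a container ), and report its total usage
ls /boot
du -sh /boot
```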

and now I have this: -

df -kmh

Filesystem                   Size  Used Avail Use% Mounted on
udev                          16G     0   16G   0% /dev
tmpfs                        3.2G  9.4M  3.2G   1% /run
/dev/mapper/ubuntu--vg-root  2.7T  344G  2.2T  14% /
tmpfs                         16G   52M   16G   1% /dev/shm
tmpfs                        5.0M  4.0K  5.0M   1% /run/lock
tmpfs                         16G     0   16G   0% /sys/fs/cgroup
/dev/loop0                    89M   89M     0 100% /snap/conjure-up/527
/dev/loop2                    80M   80M     0 100% /snap/conjure-up/745
/dev/loop3                    81M   81M     0 100% /snap/core/2462
/dev/loop4                    89M   89M     0 100% /snap/conjure-up/549
/dev/loop5                    82M   82M     0 100% /snap/core/2898
/dev/sda2                    473M  132M  317M  30% /boot
/dev/sda1                    511M  3.4M  508M   1% /boot/efi
tmpfs                        3.2G   80K  3.2G   1% /run/user/1000
/dev/loop6                    84M   84M     0 100% /snap/core/3017

which is nice :-)
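
For future reference, the culprit here is usually old kernels piling up in /boot, and a quick filter over df's output flags any filesystem that's nearly full before the updater complains. A minimal sketch (the 80% threshold is an arbitrary choice of mine):

```shell
# Flag any filesystem over 80% full; in df's default output,
# column 5 is Use% and column 6 is the mount point
df -h | awk 'NR>1 { pct=$5; sub(/%/, "", pct); if (pct+0 > 80) print $6 " is at " $5 }'
```

Run before the fix, this would have flagged /boot at 81%; after sudo apt-get autoremove it drops off the list. (The snap /dev/loop devices will always show 100%, since squashfs images are read-only and full by design.)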

Friday, 13 October 2017

Git and Jenkins - Learning Resources

This is what I've been reading and using over the past few days: -

The Pro Git Book

And now for Maven …..

Jenkins to Git - SSH says "No"

As per my earlier post: -


I'm on a voyage of discovery with Jenkins and Git.

Whilst trying to plumb one into t'other, I was hitting a blocker.

To recap, I have Jenkins installed on my MacBook, running locally, and I have Git running on a Docker container on a remote Mac.

Therefore, I'm connecting to the remote Git repository using SSH rather than, say, HTTPS or a local file-system.

This works OK for me using Git commands such as: -

git push

So, in the world of Jenkins, I thought it'd be equally simple.

To start with, I created a new job / project: -


chose Git as my SCM: -


added in the SSH URL: -


and immediately saw this: -

Failed to connect to repository : Command "git ls-remote -h ssh://git@192.168.1.214:2222/git-server/repos/myrepo.git HEAD" returned status code 128:
stdout: 
stderr: Permission denied (publickey,keyboard-interactive). 
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

before I'd had a chance to enter some credentials :-(

I clicked the button to add new creds: -


which led me here: -


I tried pasting the private key into the Key field, having used the command: -

ssh-keygen -y -f ~/.ssh/id_rsa

to derive the public key from that private key.

I'd previously validated that the derived public key matches the stored one: -

~/.ssh/id_rsa.pub
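
Incidentally, that check is easy to script: ssh-keygen -y regenerates the public key from the private one, so comparing its output with the stored .pub file proves they're a pair. A minimal sketch, demonstrated on a throwaway key rather than a real one:

```shell
# Generate a disposable key pair to demonstrate the check
tmp=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -f "$tmp/id_rsa"

# ssh-keygen -y derives the public key from the private key; the .pub
# file also carries a trailing comment, so compare only the first two
# fields (key type and key material)
derived=$(ssh-keygen -y -f "$tmp/id_rsa")
stored=$(cut -d' ' -f1-2 "$tmp/id_rsa.pub")

if [ "$derived" = "$stored" ]; then
  echo "keys match"
else
  echo "keys DO NOT match"
fi

rm -rf "$tmp"
```

Swap $tmp/id_rsa for ~/.ssh/id_rsa to check the real pair.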

However, when I switched back to the project configuration screen, I saw this: -

Failed to connect to repository : Command "git ls-remote -h ssh://git@192.168.1.214:2222/git-server/repos/myrepo.git HEAD" returned status code 128:
stdout: 
stderr: Load key "/Users/Shared/Jenkins/tmp/ssh6857222762876740778.key": invalid format 
Permission denied (publickey,keyboard-interactive). 
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

After much faffing about, I switched the Credentials to this: -


i.e. specifically pulling the key from ~/.ssh on the main Mac, upon which Jenkins is running (hence the Jenkins master).

But I was still seeing this: -

Failed to connect to repository : Command "git ls-remote -h ssh://git@192.168.1.214:2222/git-server/repos/myrepo.git HEAD" returned status code 128:
stdout: 
stderr: Load key "/Users/Shared/Jenkins/tmp/ssh6850003580465807718.key": invalid format 
Permission denied (publickey,keyboard-interactive). 
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

Then I realised that it was looking in the home directory of A DIFFERENT USER - the jenkins user, whose home is /Users/Shared/Jenkins :-)

There's even a clue in the error above: -

stderr: Load key "/Users/Shared/Jenkins/tmp/ssh6850003580465807718.key": invalid format 

So I switched to root: -

su -

Changed to the appropriate ~/.ssh directory for the Jenkins user: -

cd /Users/Shared/Jenkins/.ssh

and copied the private key: -

cp /Users/davidhay/.ssh/id_rsa .

and tried again.

Alas: -

Failed to connect to repository : Command "git ls-remote -h ssh://git@192.168.1.214:2222/git-server/repos/myrepo.git HEAD" returned status code 128:
stdout: 
stderr: Load key "/Users/Shared/Jenkins/tmp/ssh2229777690807748085.key": invalid format 
Permission denied (publickey,keyboard-interactive). 
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

I validated that the key was in the right place: -

ls -al /Users/Shared/Jenkins/.ssh

total 16
drwx------  4 jenkins  jenkins   128 13 Oct 11:14 .
drwxr-xr-x  7 jenkins  jenkins   224 12 Oct 10:34 ..
-rw-------  1 root     jenkins  1766 13 Oct 11:14 id_rsa
-rw-r--r--  1 jenkins  jenkins   363 13 Oct 10:15 known_hosts


and then noticed the obvious mistake.

Can you see where I went wrong?

-rw-------  1 root     jenkins  1766 13 Oct 11:14 id_rsa

I changed the ownership: -

chown jenkins:jenkins /Users/Shared/Jenkins/.ssh/id_rsa

validated the change: -

ls -al /Users/Shared/Jenkins/.ssh

total 16
drwx------  4 jenkins  jenkins   128 13 Oct 11:14 .
drwxr-xr-x  7 jenkins  jenkins   224 12 Oct 10:34 ..
-rw-------  1 jenkins  jenkins  1766 13 Oct 11:14 id_rsa
-rw-r--r--  1 jenkins  jenkins   363 13 Oct 10:15 known_hosts
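
For next time, that ownership check is scriptable too; ls -l's third column is the file's owner. A sketch, using the key path from this Jenkins install and jenkins as the expected user (adjust both to suit):

```shell
# Check that the SSH key is owned by the user who needs to read it
key=/Users/Shared/Jenkins/.ssh/id_rsa
expected=jenkins

# Third column of ls -l is the owning user
owner=$(ls -l "$key" 2>/dev/null | awk '{print $3}')

if [ "$owner" = "$expected" ]; then
  echo "ownership OK"
else
  echo "wrong owner '$owner' - fix with: sudo chown $expected $key"
fi
```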

and retried Jenkins: -


To finish, I added a new Build step: -


which compiles and executes the Java sourced from Git, and then ran the Build: -


Whilst I was on the Jenkins master, I also checked the workspace: -

ls -al /Users/Shared/Jenkins/Home/workspace/DaveHay

total 24
drwxr-xr-x   6 jenkins  jenkins  192 13 Oct 10:21 .
drwxr-xr-x   6 jenkins  jenkins  192 12 Oct 13:59 ..
drwxr-xr-x  13 jenkins  jenkins  416 13 Oct 11:26 .git
-rw-r--r--   1 jenkins  jenkins  462 13 Oct 11:26 HelloWorld.class
-rw-r--r--   1 jenkins  jenkins  148 13 Oct 10:21 HelloWorld.java
-rw-r--r--   1 jenkins  jenkins   25 13 Oct 10:18 Readme


which showed the newly compiled Java class.
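
The build step itself isn't reproduced above, but for a project like this it boils down to an "Execute shell" step along these lines (the HelloWorld source here is a stand-in, as the post doesn't show the actual file from Git):

```shell
# What the Jenkins "Execute shell" build step amounts to: compile the
# checked-out Java source, then run the resulting class.
# HelloWorld.java is a hypothetical stand-in for the repo's real source:
cat > HelloWorld.java <<'EOF'
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World");
    }
}
EOF

javac HelloWorld.java
java HelloWorld
```

Jenkins runs the step from the job's workspace, which is why HelloWorld.class appears alongside HelloWorld.java in the listing above.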

Job done :-)