Category Archives: Docker

IBM Connect 2017 – slides, news and so on

This year I attended IBM Connect in San Francisco. In my eyes it was a great event and I enjoyed it very much.

Some announcements are very important for the future and evolution of the IBM portfolio:

  • IBM Connections Pink – Jason Gary and the IBM development team showed the future of IBM Connections. The basis will be Docker and a lot of other open source products. I look forward to working with a completely new stack and am very curious about deployment, migration and scaling. It is a complete rewrite and will no longer need DB2 or WebSphere. A good summary was written by Glenn Kline.
  • panagenda ApplicationInsights – all IBM Domino customers with valid maintenance will get ApplicationInsights to analyze the code and usage of their Domino databases
  • IBM Domino will be updated through feature packs; we will get Java 8 and other long-awaited functionality
  • IBM announced a new lifetime IBM Champion: Julian Robichaux, big congrats to him and well deserved

Only a few session slides are available through the official conference page (we provided ours, but they are still not online), so we uploaded them to SlideShare:

Best and Worst Practices for Deploying IBM Connections


IBM Connections Adminblast


All other session slides from my panagenda colleagues can be found in the panagenda SlideShare account.

Update

During the 11-hour flight to San Francisco I used the time to update the XPages and Generic HTML Widgets (OpenNTF) for IBM Connections 5.5 CR2. Frank van der Linden uploaded the changes today.

Better logstash filter to analyze SystemOut.log and some more

Last week I wrote a post about Using Docker and ELK to Analyze WebSphere Application Server SystemOut.log, but I wasn’t happy with my date filter and with how the WebSphere response code was parsed. The main problem was that the WAS response code is not always at the beginning of a log message and does not always end with “:”.

I replaced the previous filter (formerly four lines with match) with the following code:

grok {
    # was_shortname needs to be a regex, because numbers and $ can be part of the word
    match => ["message", "\[%{DATA:wastimestamp} %{WORD:tz}\] %{BASE16NUM:was_threadID} (?<was_shortname>\b[A-Za-z0-9\$]{2,}\b) %{SPACE}%{WORD:was_loglevel}%{SPACE} %{GREEDYDATA:message}"]
    overwrite => [ "message" ]
    #tag_on_failure => [ ]
}
grok {
    # extract the WebSphere response code (it can appear anywhere in the message)
    match => ["message", "(?<was_responsecode>[A-Z0-9]{9,10})[:,\s\s]"]
    tag_on_failure => [ ]
}
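
The extracted wastimestamp can then be mapped to @timestamp with the Logstash date filter. A minimal sketch (the pattern is an assumption for the default US-style WAS timestamp; adjust it to the locale and timezone of your SystemOut.log):

date {
    # map the WAS timestamp (e.g. "2/20/17 16:01:38:855") to @timestamp
    # the pattern is an assumption, adjust it to the date format of your SystemOut.log
    match  => [ "wastimestamp", "M/d/yy H:mm:ss:SSS" ]
    target => "@timestamp"
}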

(more…)

Using Docker and ELK to Analyze WebSphere Application Server SystemOut.log

I often get SystemOut.log files from customers or friends to help them analyze a problem. It is often complicated to find the server and application that generates the real error, because most WebSphere applications (like IBM Connections or Sametime) are spread across different application servers and nodes. So you need to open multiple large files in your editor, scroll each one to the relevant timestamp and check the preceding lines for possible error messages.

Klaus Bild has shown the functionality of ELK at several conferences and on his blog, so I thought about using ELK too. I started by building a virtual machine with the ELK stack (Elasticsearch, Logstash & Kibana) and imported my local logs and logs I got by mail. This approach is fine for analyzing your own environment, but it is not the best way to add just a few SystemOut.logs from outside, because it is hard to remove that data from an ELK instance after the analysis.

Then I found a tutorial on installing ELK with Docker; there is even a great Part 2 of this post, which helps with installing an ELK cluster. Just follow these blog posts and install Docker and Docker Compose; the stack is deployed really quickly. In my case I do not use Flocker, it’s enough to use a local data container for the Elasticsearch data.
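
A minimal docker-compose sketch of how the Elasticsearch service can use a local named volume instead of Flocker could look like this (image name, tag and volume name are assumptions; take the real values from the tutorial):

version: '2'
services:
  elasticsearch:
    image: elasticsearch:2.4                    # assumption, use the image/tag from the tutorial
    ports:
      - "9200:9200"
    volumes:
      - esdata:/usr/share/elasticsearch/data    # local named volume instead of Flocker
volumes:
  esdata: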

Why do I use Docker instead of a virtual machine? It’s super easy to just drop the data container and begin with an empty database.
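
A rough sketch of that workflow (assuming the named volume from the compose sketch above; docker-compose prefixes it with the project/folder name):

docker-compose down           # stop and remove the ELK containers
docker volume rm elk_esdata   # drop the Elasticsearch data volume -> empty database
docker-compose up -d          # start again with a clean instance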

My setup

I created a folder on my Mac to hold everything needed for the Docker images:

mkdir elk
cd elk
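
What ends up in this folder is roughly the following (a sketch; file and folder names are assumptions, the compose file is the one from the tutorial):

elk/
├── docker-compose.yml    # compose file based on the tutorial (see the sketch above)
└── logstash/
    └── logstash.conf     # input, filter (grok/date) and output configuration

From there, docker-compose up -d brings the stack up in the background.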

(more…)