Sunday, September 17, 2017

How I quickly build a dashboard (Docker, Docker Compose, json-server and Smashing)

Main points

I use Smashing (smashing.io); to be fast and to avoid a local install, I run it in a container.

This solution lets me avoid installing Ruby, the specific gems Smashing needs, libXXX-devel, libZZZ and so on.

The target

  • add a pie chart widget to the board
  • relying on a JSON API

Start it with docker-compose

I will use Docker Compose for development, with two services:
  1. one for Smashing
  2. one to fake my data server (json-server)
The docker-compose file I use (docker-compose.yml):
version: '2'
services:
   dashboard:
      image: "rgcamus/alpine_smashing"
      ports:
         - "8080:3030"
      networks:
         - localnetwork
   jsonserver:
      image: "williamyeh/json-server"
      volumes:
         -  /home/franckys/03_DATA/workspace/fakeData:/data
      command:
         --watch db.json
      networks:
         - localnetwork
      expose:
         - "3000"
networks:
   localnetwork:
      driver: bridge

I don't mount a volume for the dashboard: I want a "self-contained" image. So the dashboard sources will be embedded in the image, and for safety I will use git to check out the board sources.

During development I use "docker cp src target" to inject modifications into my dashboard container.

First run

From the command line, a "docker-compose up" does the job. Point your favorite browser at localhost:8080 and you will see the default dashboard:

Build the board

Fake a JSON API

In my case I want to connect my board to a JSON API. Here I don't want to show how to connect to the real API, but how to fake it.

npm json-server

To fake my server I use json-server through npm.
Just define a db.json file and start json-server, and you get access to a fake REST API.
In this container case, the file must be in the folder mounted as a volume; with the previous docker-compose it means in:
/home/franckys/03_DATA/workspace/fakeData


Here is my file content (db.json):
{
  "issues": [
      { 
      "id": "FRKY_10",
      "epicLink":  "FRKY_3",
      "summary": "story one",
      "storyPoint" : 3
      },
      { 
      "id": "FRKY_11",
      "epicLink":  "FRKY_3",
      "summary": "story two",
      "storyPoint" : 3
      },
      { 
      "id": "FRKY_1",
      "epicLink":  "FRKY_2",
      "summary": "story one",
      "storyPoint" : 2
      },
      { 
      "id": "FRKY_5",
      "epicLink":  "FRKY_12",
      "summary": "story twelve",
      "storyPoint" : 5
      }
     
  ],
  "epic": [
      { 
      "id": "FRKY_3",
      "title": "EPIC01"
      },
      { 
      "id": "FRKY_2",
      "title":  "EPIC02"
      }      
  ]
}

If you want to see it in action, replace in docker-compose.yml:

      expose:
         - "3000"

by

      ports:
         - "3000:3000"

This change:
  • binds container port 3000 to port 3000 on the host
  • replaces the expose-only declaration on the docker network localnetwork
Then point your favorite browser at http://localhost:3000/epic and you should obtain:
[
  {
    "id": "FRKY_3",
    "title": "EPIC01"
  },
  {
    "id": "FRKY_2",
    "title": "EPIC02"
  }
]


Don't forget to revert this modification.

Using the data

Smashing relies on scheduled jobs to fetch data. Just have a look at the jobs directory:

# identify your container instance id

docker ps 

# run a bash

docker exec -it ${YOUR_CONTAINER_ID} bash

# In the container bash

ls jobs

buzzwords.rb    convergence.rb  sample.rb       twitter.rb

As you can see, jobs are written in Ruby. So, to use our fake data we have to:
  1. make an HTTP request
  2. process the data and send it to the board

Where is my pie chart

I will use this widget:
Google Visualizations Pie Chart. So first we should install it, then use it in the board, then provide it with data.

Install it

To install this widget we have to be connected to the Smashing container:
docker exec -it ${YOUR_CONTAINER_ID} bash

then install the widget:
smashing install GIST_ID (see here)

Use it

Everything is explained in the widget documentation (here).

You have to:
  • modify the layout.erb file in /dashboard
  • modify the board; in my case I modify the sample.erb file
  • then add a job to access the data (see next paragraph)

Provide it data

Providing data consists in writing a Ruby job that accesses my JSON API and processes the response so the data can be sent in this format:
send_event('mychart', slices: [
        ['Task', 'Hours per Day'],
        ['Work',     11],
        ['Eat',      2],
        ['Commute',  2],
        ['Watch TV', 2],
        ['Sleep',    7]
      ])
So, for the HTTP access and the processing, the job looks like this:

require 'net/http'
require 'json'

uri = URI.parse("http://compose_jsonserver_1:3000/samples")
request = Net::HTTP::Get.new(uri)

begin
  response = Net::HTTP.start(uri.hostname, uri.port) do |http|
    http.request(request)
  end

  parsed = JSON.parse(response.body)
  # etc....

  # send data to the widget, data is in arrayToSend
  header = Array.new
  header.insert(0, "header1")
  header.insert(1, "header2")
  arrayToSend.insert(0, header)

  logger.info("ArrayEnd #{arrayToSend}")
  send_event('mychart2', slices: arrayToSend)
rescue StandardError => e
  logger.error("Job failed: #{e.message}")
end




Friday, October 14, 2016

Saving and restoring Jenkins jobs on Windows

A small post here, just as a reminder.
I found a lot of material for Linux bash but very few for Windows... and guess what, I'm working with Windows.

Steps

You are working on Windows and you want to save all your Jenkins jobs. So you have to:
  • list all the jobs
  • save each job

and then be able to import them back...

Batch

The save batch:

java -jar jenkins-cli.jar -s http://localhost:8190/DEV/jenkins/ list-jobs > list.txt
for /f %%i in (list.txt) do java -jar jenkins-cli.jar -s http://localhost:8190/DEV/jenkins/ get-job %%i > %%i.xml


Note that I create a file containing all my job names. Later, when importing, it will be easier to use the real job names (no need to deduce them from the file names).

The import batch:

for /f %%i in (list.txt) do java -jar jenkins-cli.jar -s http://localhost:8190/DEV/jenkins/ create-job %%i_new < %%i.xml


Depending on your setup you could be asked for authentication.


that's all.


Thursday, June 16, 2016

Maven POST a zip through HTTPS using gmaven and HttpClient

What

Well, in this blog post I put everything needed to POST a build product to a defined URL.
This product could be produced by an assembly or (in my case) by a Tycho repository build.

How

First of all, we use Maven and an integration chain, and I don't want to use a post-build script in Jenkins (it's external...).
So we have two main options:
  • build a Maven plugin
  • embed something in the pom.
As we have very few sites to push, I target the second option and use gmaven, which allows executing a Groovy script, so everything is open.
To do the HTTPS POST I use Apache HttpClient; this framework gives a good way to do POSTs with authentication, SSL and all those kinds of things.

So the script :

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.security.KeyManagementException;
import java.security.KeyStoreException;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.security.cert.X509Certificate;

import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

import org.apache.http.HttpEntity;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
import org.apache.http.conn.ssl.TrustSelfSignedStrategy;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.mime.HttpMultipartMode;
import org.apache.http.entity.mime.MultipartEntityBuilder;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.ssl.SSLContexts;
import org.apache.http.util.EntityUtils;

class X509TM implements X509TrustManager {
 public X509Certificate[] getAcceptedIssuers(){
  return null;
  }

 public void checkClientTrusted(X509Certificate[] certs, String authType){}
 public void checkServerTrusted(X509Certificate[] certs, String authType){}
};

String url = pom.properties["http.url"];
if(url != null)  {
 log.info("Try to post site at url : " + url);
 FileInputStream fileInputStream = null;
 try {
  //TRUST ALL: bad, but it should only be a temporary workaround
  X509TrustManager X509Tm = new X509TM();
  TrustManager[] trustAllCerts = [X509Tm] as TrustManager[] ;
  SSLContext sslcontext = SSLContexts.custom()
    .loadTrustMaterial(null, new TrustSelfSignedStrategy())
    .build();
  sslcontext.init(null, trustAllCerts, new SecureRandom());
  SSLConnectionSocketFactory sslsf = new SSLConnectionSocketFactory(sslcontext);
  CloseableHttpClient httpclient = HttpClients.custom().setSSLSocketFactory(sslsf).build();
 
  String outDir = pom.properties['outputdir'];
  String pname = pom.properties['pname'];
  String pversion = pom.properties['pversion'];
  HttpPost post = new HttpPost(url);
  File file = new File(outDir+"/"+pname+"-"+pversion+".zip");
  if(file.exists())  {
   log.info("multipart to post : " + file.getAbsolutePath());
   //create multi part
   MultipartEntityBuilder builder = MultipartEntityBuilder.create();
   builder.setMode(HttpMultipartMode.BROWSER_COMPATIBLE);
   builder.addBinaryBody("site", file, ContentType.MULTIPART_FORM_DATA, file.getName());
   HttpEntity multipart = builder.build();
   post.setEntity(multipart);
   //DO POST
   CloseableHttpResponse  response = httpclient.execute(post);
   HttpEntity entityRep = response.getEntity();
   if (entityRep != null) {
    String page = EntityUtils.toString(entityRep);
    log.info("PAGE :" + page);
   }
   EntityUtils.consume(entityRep);
  } else  {
   log.info("File "+file.getAbsolutePath() +" not found");
  }
  } catch (Exception e) {
  e.printStackTrace();
  if(fileInputStream != null )
   try {
    fileInputStream.close();
   } catch (IOException e1) {
    e1.printStackTrace();
   }
  }
} else {
 log.info("Post site is disabled (url is null)");
}


And finally the pom. As you can see, I set a Maven profile activated by a system property:

<project>
...
<profiles>
 <profile>
  <activation>
  <!--ACTIVATE through -Denvprops=Push-->
   <property>
    <name>envprops</name>
    <value>Push</value>
   </property>
  </activation>
  <properties>
   <http.url>${YOURURL}</http.url>
   <pname>${project.name}</pname>
   <pversion>${project.version}</pversion>
   <outputdir>${project.build.directory}</outputdir>
  </properties>
 </profile>
</profiles>

<build>
 <plugins>
  <plugin>
   <groupId>org.codehaus.gmaven</groupId>
   <artifactId>gmaven-plugin</artifactId>
   <version>1.5</version>
   <dependencies>
    <dependency>
     <groupId>org.apache.httpcomponents</groupId>
     <artifactId>httpclient</artifactId>
     <version>4.5.2</version>
    </dependency>
    <dependency>
     <groupId>org.apache.httpcomponents</groupId>
     <artifactId>httpmime</artifactId>
     <version>4.5.2</version>
    </dependency>
   </dependencies>
   <executions>
    <execution>
     <!-- To be after Zip generation -->
     <phase>install</phase>
     <goals>
      <goal>execute</goal>
     </goals>
     <configuration>
      <source>${project.basedir}/post.groovy</source>
     </configuration>
    </execution>
   </executions>
  </plugin>
 </plugins>
</build>
</project>

Sunday, May 29, 2016

Installing the Android Play Store on a "custom" pad

My problem:

  • I have a pad with a custom OS, based on Android 4.2.2
  • I do not have the Play Store installed
  • I want one! :-)

Disclaimer

WARNING: this whole post contains information that worked for me.
  • I download and do many things that could be risky...
  • I don't know whether my downloads are safe or not
  • I don't know if the command lines I run will work and if they are safe (probably not, since I change the mount permissions on /system...)
So if you reproduce this "recipe", be aware that it's at your own risk!
At the very least do a backup and save all your things... (if it's not done regularly)

The target

Install some files (apk) in /system/app:
  • GoogleServicesFramework.apk
  • OneTimeInitializer.apk
  • GoogleLoginService.apk
  • GoogleFeedback.apk
  • Phonesky.apk

Get those files

I downloaded a zip of gapps (Google applications), named something like gapps-jb-20130812-signed.
I downloaded this one because it targets Android 4.2.2 (the version my device customization is based on).
Downloading those files as a "pack" gives me a coherent set of compatible files.
So, unzip it and look for the desired files; we will use them after installing adb.

Install adb

ADB is an acronym for Android Debug Bridge: this piece of software allows you to take control of your device from your PC.

I downloaded a file named adb-setup-1.4.3; see http://forum.xda-developers.com/showthread.php?p=48915118#post48915118 for more explanation.

I first tried to install "adb driver", but with Windows 10 I had problems installing the device because Windows 10 enables driver signature enforcement by default. I tried to work around that, but finally installed "adb-setup".

Push those files

WARNING: see the disclaimer chapter.
Push each file using this adb command line:
adb push GoogleServicesFramework.apk /system/app

Repeat this for all the files. Previously I had transferred those apk to the device, then clicked on each one to try to install it, but that does not work (GoogleServicesFramework.apk does not install at all...).
If it fails with a read-only file system error, you have to use adb shell from a Windows 10 administrator command prompt:
adb shell
then, in the adb shell (#):
# su
# mount -o rw,remount /system

At the end

I restarted my device and everything works. I now have access to the Android Play Store.

Thursday, March 31, 2016

Java Graph visualization and EMF


The beginning

I'm currently working in an integration test team. The main goal of this team is to assemble software parts and then test them, from nominal cases up to the limits.
To achieve this task they build data: input data, result validation data and non-regression test data. This team works with (space) flight dynamics algorithms, and the manipulated data
  • are numerous
  • have to form a consistent whole
These data are in fact organized as a graph, and a tree view does not always fit to validate/control them...

One solution : a graph view

We work with an EMF (Eclipse Modeling Framework) data implementation. So I started by reusing a generic editor (based on a reflective item provider; the source can be found here) and adapting it to fit our needs. But I felt that was not enough to validate and view the data.
At this point I remembered my work on the graph database OrientDB. This tool has a graph editor and it was nice to see cross references and so on...
So I searched for a Java graph visualization tool. I quickly found JUNG (Java Universal Network/Graph Framework), which fits my "first order" needs:
  • easily managed graphs
  • many provided layouts (see KKLayout, FRLayout, SpringLayout, CircleLayout and more...)

Usage

Usage is quite easy: JUNG provides sample projects that are very useful and run quite well.
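
To give an idea, here is a minimal sketch in the spirit of the JUNG sample code (my own example, not the code of my Eclipse integration): it builds a tiny graph, picks one of the provided layouts and shows it in a Swing frame.

import java.awt.Dimension;
import javax.swing.JFrame;

import edu.uci.ics.jung.algorithms.layout.FRLayout;
import edu.uci.ics.jung.algorithms.layout.Layout;
import edu.uci.ics.jung.graph.DirectedSparseGraph;
import edu.uci.ics.jung.graph.Graph;
import edu.uci.ics.jung.visualization.BasicVisualizationServer;

public class JungDemo {
  public static void main(String[] args) {
    // Build a tiny graph; vertices and edges are plain strings here
    Graph<String, String> graph = new DirectedSparseGraph<String, String>();
    graph.addVertex("A");
    graph.addVertex("B");
    graph.addVertex("C");
    graph.addEdge("A->B", "A", "B");
    graph.addEdge("A->C", "A", "C");

    // Pick one of the provided layouts (FRLayout, KKLayout, SpringLayout, CircleLayout...)
    Layout<String, String> layout = new FRLayout<String, String>(graph);
    layout.setSize(new Dimension(400, 400));

    BasicVisualizationServer<String, String> vv =
        new BasicVisualizationServer<String, String>(layout);
    vv.setPreferredSize(new Dimension(450, 450));

    JFrame frame = new JFrame("JUNG demo");
    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    frame.getContentPane().add(vv);
    frame.pack();
    frame.setVisible(true);
  }
}

A Swing component like this BasicVisualizationServer is presumably what ends up embedded in the Eclipse view through the SWT_AWT bridge described below.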

Eclipse view Integration

My integration is currently very direct. I:
  • use the SWT_AWT Eclipse bridge to integrate the visualization in a view
  • add a command in the generic editor popup menu to invoke a handler that refreshes and shows the graph view
  • rely on the reflective content provider to build the graph
  • set a color for each datatype (we have about 1000)
TODO Png

Next

  • More integration, to allow highlighting from the generic editor to the graph view and back
  • Set a short name for vertices and a tooltip for the contents
  • Render edges differently depending on whether they are containment edges or not
  • Optimize memory usage (currently no more than about one hundred vertices... due to sharing memory with Eclipse?)



Friday, February 5, 2016

Building a small Arduino powered robot

It's time to play with an Arduino and a motor shield

Many weeks ago I received enough parts to build a small Arduino robot, and now I have the time. Let's build it...

For this project I have no real objective. I just want to assemble an autonomous robot that avoids walls and other objects.

The parts

  • An Arduino Uno for control/command
  • A Seeed Motor Shield V2 for power and to simplify motor control
  • An HC-SR04 ultrasonic sensor for object detection
  • A 9g servo motor to rotate the ultrasonic sensor
  • A 9V battery clip connector to a 2.1mm power plug (see here to build or buy)
  • A 2WD motor chassis, with bolts and nuts, wheels, two small DC motors, a four-AAA battery pack, some wires and a tail wheel
 

The assembly

Nothing very important to say here. Try to assemble, see what goes wrong, reassemble, see what goes wrong, and so on.


The only thing to say is that I made each part work separately first:
  • the HC-SR04 ultrasonic sensor
  • then the servo
  • then the motor shield
And finally I integrated them. And at this point it does not work... as usual...
  • the motor shield uses pins 8 to 13
  • AND the Servo library disables PWM on pins 9 and 10

So it does not work. The solution is to change the pin mapping in the motor shield library from 9,10 to 5,6 and connect them physically (not shown on this picture).
I saw this on a blog here. It works for me too, but if you try it, you try at your own risk.

Servo

Library used: the Arduino Servo library

#include "Servo.h"

Servo myServo;                // the servo that rotates the ultrasonic sensor
const int SERVO_CMD = 14;     // pin 14 = A0 on the Uno

void setup() {
  myServo.attach(SERVO_CMD);
}

and
myServo.write(angle);

MotorShield (Seeed v2)

Library used: the Seeed MotorDriver library, modified into MotorDriver56 to use PWM pins 5,6 in place of 9 and 10

#include "MotorDriver56.h"

void setup() {
  motordriver.init();
  motordriver.setSpeed(200, MOTORB);
  motordriver.setSpeed(200, MOTORA);
}
and
motordriver.goForward();
motordriver.goBackward();
motordriver.goLeft();
motordriver.goRight();
motordriver.stop();

typical use

motordriver.go${Action}();
delay(time);
motordriver.stop();

HC-SR04

use
   const int trigPin = 4;
   const int echoPin = 7;

and
   int calculateDistance(){ 
      digitalWrite(trigPin, LOW); 
      delayMicroseconds(2);
      // Sets the trigPin on HIGH state for 10 micro seconds
      digitalWrite(trigPin, HIGH);
      delayMicroseconds(10);
      digitalWrite(trigPin, LOW);

      // Reads the echoPin: returns the sound wave travel time in microseconds
      long duration = pulseIn(echoPin, HIGH);
      int distance = duration * 0.034 / 2;   // speed of sound ~0.034 cm/us, halved for the round trip
      return distance;
    }

The basic control / command


if (nothing front) {
   goFront()
} else if ( something left ) {
   goBackward()
   goRight()
} else {
   goBackward() 
   goLeft()
}

But ...

My small rover does not go straight. The reasons... the motors are different and the tail wheel is not so good. Those two points cause deviations.

Next improvements

  • the tail wheel causes trajectory deviation; I need to change it for a ball caster, or use tracks
  • add encoder measurements to get a better distance/movement estimation
  • there's no sensor at the back (to secure goBackward())
  • use Arduino interrupts

Ultimate goal

Transform this rover into a sensor. I mean acquire distance measurements, send them via an ESP8266 (WiFi) to a Raspberry Pi, and finally build a 3D map of the rover's surroundings.

Monday, February 1, 2016

Streams, image processing, Java 8 and parallel

During the last few days I decided to go further with Java 8 streams and lambdas. As I currently work on different projects set to Java 5 compatibility, I do not have a lot of "material" to experiment with.

Learn

So I began, as usual, with Google to find examples and/or tutorials.
I mainly found case studies starting with a simple list, applying mapping, filtering and finally reducing.

Those examples are very useful and well explained, and they were helpful as a starting point.

Apply

Now it's time to try it myself. I do not want to work on the case studies, because I want to think about the solution and find out by myself how to solve a given problem with streams. And I want to make something (I hope) useful (at least for me...).
So which example? A long time ago I worked on image processing: segmentation, morphological transformations, skeletonization, Hough transform, on color (HSL, RGB), gray level or binary images.
An image is a good candidate for my streaming example:
  • it could be useful
  • it can be big enough to see the effect of parallelism
  • you can easily imagine how to apply a structuring element to an image stream

Imagine the stream

As I want to apply a 3x3, 5x5, nxn structuring element, I thought about two options:
  1. walking over a stream of vectors built from the original image, each vector having the same dimensions as the structuring element
  2. iterating over an integer stream limited to (height x width) of the image, which allows building the target image
I gave up on the first solution because:
  • the memory footprint is four times the image size for a 3x3 kernel: 3 for the stream + 1 for the target
  • the time spent building the stream is excessive and depends on the kernel size.

Stream coding

To implement the second solution I have to:
  • iterate over the range [0, height * width]
  • for each index, compute the value resulting from the kernel application
  • put it in the resulting image.
I found two ways to implement this:

IntStream.range(0, height * width).forEach(n -> applyKernel(n));
or
Stream.iterate(0, n -> n + 1).limit(height * width).forEach(n -> applyKernel(n));

The applyKernel(int n) method computes the target pixel resulting from the kernel application (for pixel n).

To choose between the two implementations, I measured the execution time for the same image, run the same number of times on the same machine, excluding the 2 worst runs for each. The winner is the first solution, with 880 ms compared with 1000 ms.

Why? I think it's because IntStream.range() is provided for exactly this purpose, i.e. producing an int range, whereas Stream.iterate() plus limit() generates a stream by repeatedly applying the given function to the seed; it's something more sophisticated. Another point could be that the limit() method produces another stream. So in the end I think there is more "mechanics" in the second solution, which implies less speed.
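
For the record, here is a rough timing harness in the spirit of that comparison. It is my own sketch, not the benchmark I actually ran: the image size is arbitrary and the applyKernel body is a placeholder. It runs each variant several times, drops the 2 worst runs and averages the rest.

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.IntConsumer;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class KernelBench {

  static long timeMs(Runnable r) {
    long start = System.nanoTime();
    r.run();
    return (System.nanoTime() - start) / 1_000_000;
  }

  // average after dropping the 2 worst (slowest) runs
  static double average(List<Long> times) {
    Collections.sort(times);
    return times.subList(0, times.size() - 2)
                .stream().mapToLong(Long::longValue).average().orElse(0);
  }

  public static void main(String[] args) {
    final int height = 1080, width = 1920;
    final IntConsumer applyKernel = n -> { /* kernel application would go here */ };

    List<Long> rangeTimes = new ArrayList<>();
    List<Long> iterateTimes = new ArrayList<>();
    for (int run = 0; run < 10; run++) {
      rangeTimes.add(timeMs(() ->
          IntStream.range(0, height * width).forEach(applyKernel)));
      iterateTimes.add(timeMs(() ->
          Stream.iterate(0, n -> n + 1).limit(height * width)
                .forEach(n -> applyKernel.accept(n))));
    }
    System.out.println("IntStream.range : " + average(rangeTimes) + " ms");
    System.out.println("Stream.iterate  : " + average(iterateTimes) + " ms");
  }
}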

Processing image

In this example, to process the image I:
  • load it as a BufferedImage
  • convert it to gray (byte)
  • filter it with an octagonal kernel
  • convert it to binary, relying on DataBufferByte
  • apply 2 closings (dilate x 2 then erode x 2)
  • then run edge detection

Load image

File imageFile = ... 
BufferedImage img = ImageIO.read(imageFile);

Convert to byte gray

BufferedImage grayImg = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
BufferedImageOp gsOp = new ColorConvertOp(
          imageToConvert.getColorModel().getColorSpace(),
          grayImg.getColorModel().getColorSpace(),null);
gsOp.filter(imageToConvert, grayImg);

Filter with Octagon kernel

It is the application of this kernel:
|1,1,1|
|1,1,1|  divided by 9
|1,1,1|

It means that the resulting pixel is the average of itself and the 8 pixels around it.
To do that I access the raster bytes directly:
byte[] imageAsByteArray = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
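
As an illustration, a hypothetical applyKernel for this 3x3 averaging kernel could look like the sketch below; src and dst are the gray byte[] buffers obtained as above, width and height the image size, and the border handling is my own choice.

static void applyKernel(byte[] src, byte[] dst, int width, int height, int n) {
  int x = n % width;
  int y = n / width;
  if (x == 0 || y == 0 || x == width - 1 || y == height - 1) {
    dst[n] = src[n];                      // keep border pixels untouched
    return;
  }
  int sum = 0;
  for (int dy = -1; dy <= 1; dy++) {      // walk the 3x3 neighbourhood
    for (int dx = -1; dx <= 1; dx++) {
      sum += src[n + dy * width + dx] & 0xFF;
    }
  }
  dst[n] = (byte) (sum / 9);              // average of the 9 pixels
}

It is then driven by the stream shown earlier, for example IntStream.range(0, height * width).forEach(n -> applyKernel(src, dst, width, height, n));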

Convert to binary

It means iterating over the gray scale byte[] and setting each resulting pixel to 0 or 255 depending on the input value:
if the input value is above a threshold, the output will be 255, else 0.
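
A minimal sketch of this step (grayImg is the TYPE_BYTE_GRAY image built above, the threshold parameter and the method name are mine; it uses java.awt.image.DataBufferByte and java.util.stream.IntStream):

static byte[] toBinary(BufferedImage grayImg, int threshold) {
  byte[] gray = ((DataBufferByte) grayImg.getRaster().getDataBuffer()).getData();
  byte[] binary = new byte[gray.length];
  IntStream.range(0, gray.length).forEach(i -> {
    int value = gray[i] & 0xFF;                        // unsigned pixel value
    binary[i] = (byte) (value > threshold ? 255 : 0);  // 255 = true, 0 = false
  });
  return binary;
}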

Closing

For erosion and dilation I use a CROSS kernel:
|false,true,false|
|true ,true, true|  
|false,true,false|

For dilation I apply this to the binary image (255 is true, 0 is false): if the kernel application (a logical OR over the neighbourhood) returns true, then the result is 255, 0 otherwise.
For erosion it is the same kernel, but the application uses a logical AND.
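
A sketch of one dilation pass with this CROSS kernel; the buffer names and the border handling are assumptions, and erosion would be the same loop with && instead of ||.

static void dilateCross(byte[] src, byte[] dst, int width, int height) {
  for (int y = 1; y < height - 1; y++) {
    for (int x = 1; x < width - 1; x++) {
      int n = y * width + x;
      boolean hit = (src[n] & 0xFF) == 255          // centre
              || (src[n - 1] & 0xFF) == 255         // left
              || (src[n + 1] & 0xFF) == 255         // right
              || (src[n - width] & 0xFF) == 255     // above
              || (src[n + width] & 0xFF) == 255;    // below
      dst[n] = (byte) (hit ? 255 : 0);
    }
  }
}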

The effect of the closing operator is to fill small holes in the image. The size of the holes that get filled depends on the kernel size and on the number of successive dilations and erosions.

Edge detection

On the binary image I apply the square Laplacian kernel:
|0, 1,0|
|1,-4, 1|  
|0, 1,0|
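
In the same spirit, a hedged sketch of the edge detection: apply the Laplacian to the binary buffer and mark every pixel with a non-zero response as an edge (again, names and border handling are mine).

static void laplacianEdges(byte[] src, byte[] dst, int width, int height) {
  for (int y = 1; y < height - 1; y++) {
    for (int x = 1; x < width - 1; x++) {
      int n = y * width + x;
      int response = -4 * (src[n] & 0xFF)
              + (src[n - 1] & 0xFF) + (src[n + 1] & 0xFF)
              + (src[n - width] & 0xFF) + (src[n + width] & 0xFF);
      dst[n] = (byte) (response != 0 ? 255 : 0);    // edge where the response is non-zero
    }
  }
}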

Stream and parallel

Now it's time to test the parallel feature. I read a lot of bad things about parallel streams, but it seems that all those "problems" appear in concurrent environments.

The big deal is that streams now let you achieve this easily, almost instantly, without needing Thread or another framework: just add .parallel() and you've got it.
How to use it in my example? I just do it like this:


IntStream.range(0, height * width).parallel().forEach(n -> applyKernel(n));

If you want to control the number of threads used (see here for more information):

ForkJoinPool forkJoinPool = new ForkJoinPool(4);
...
forkJoinPool.submit(() ->
    IntStream.range(0, height * width).parallel().forEach(n -> applyKernel(n))
).get();

In the end the time is cut in half, going from 880 ms down to 430 ms for an erosion.
So nice and easy. So, are there any problems lurking there?

Well, as I understand it, problems could come from:

Conclusion

I have now a better understanding on Java stream pro and cons.  Enough to think about Stream the next time I face an analoguoud context.