AstroLabNet

00.05.00 [15/Jul/2019 at 14:42:34 CEST by hrivnac]


AstroLabNet is a front-end to an archipelago of Spark services.
Full functionality is available via the CLI, the GUI and Web Services:
- full Application Help
- executing interactive Actions on the Spark Server
- interactive Web Service
- sending Jobs to the Spark Server
- searching the Spark Server Journal
- searching the Spark Server Catalog

The application is also available as a Web Service, which gives access to:
- Sessions on the Spark Server
- the Spark Server Journal
- the Spark Server Catalog

After direct download, the application can be started from AstroLabNet.exe.jar:
java -jar AstroLabNet.exe.jar
The full help:
$ java -jar AstroLabNet.exe.jar -h                
usage: java -jar AstroLabNet.exe.jar
 -b,--browser          start graphical browser (default)
 -c,--cli              start command line
 -h,--help             show help
 -p,--profile <name>   use existing site profile Local (default) or LAL
 -q,--quiet            minimal direct feedback
 -s,--source <file>    source bsh file (init.bsh is also read)
The application will try to connect to your local Server and, through it, to other known Servers.
If you want to connect directly to another server, write a custom init.bsh file in your run directory:
w.addServer("LAL", "http://vm-75222.lal.in2p3.fr:21111", "http://vm-75222.lal.in2p3.fr:20001", "http://134.158.74.54:8080");
Examples of BeanShell scripts to load (or type on the command line):
// Import Avro Alerts from file or directory
// -----------------------------------------
import com.astrolabsoftware.AstroLabNet.Avro.AvroReader;
Server server = w.server("Local");
AvroReader reader = new AvroReader(server);
reader.process("../data/test/alarms/697251920015010010.avro");
System.exit(1);
// Scan Full Catalog
// -----------------
Server server = w.server("Local");
System.out.println(server.hbase().scan("astrolabnet.catalog.1"));
System.exit(1);
// Scan Full Journal
// -----------------
Server server = w.server("Local");
System.out.println(server.hbase().scan("astrolabnet.journal.1"));
System.exit(1);
// Scan Full Topology
// ------------------
Server server = w.server("Local");
System.out.println(server.hbase().scan("astrolabnet.topology.1"));
System.exit(1);
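
The scan scripts above presumably go through the HBase REST gateway started in the setup section below (the third URL in the addServer example). As a minimal sketch of what decoding such a scan involves: HBase REST returns JSON in which row keys, column names and values are all base64-encoded. The decode_rows helper and the "d:ra" column are illustrative assumptions, not part of AstroLabNet.

```python
import base64
import json

def decode_rows(body):
    """Decode an HBase REST JSON scan result (all fields are base64)."""
    rows = []
    for row in json.loads(body).get("Row", []):
        key = base64.b64decode(row["key"]).decode()
        cells = {
            base64.b64decode(c["column"]).decode():
                base64.b64decode(c["$"]).decode()
            for c in row["Cell"]
        }
        rows.append((key, cells))
    return rows

# A GET on http://<rest-server>:8080/astrolabnet.catalog.1/* with
# Accept: application/json returns a body of this shape:
sample = json.dumps({
    "Row": [{
        "key": base64.b64encode(b"row1").decode(),
        "Cell": [{
            "column": base64.b64encode(b"d:ra").decode(),
            "$": base64.b64encode(b"12.34").decode(),
        }],
    }]
})
print(decode_rows(sample))
```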
// Create and Send Action to Spark server
// --------------------------------------
Action action = w.addAction("PythonTestAction", "1+1", Language.PYTHON);
Server server = w.server("Local");
String result = server.livy().executeAction(action);
System.out.println(result);
System.exit(1);
// Create and Send Job to Spark server
// -----------------------------------
Job job = w.addJob("JavaTestJob", "../lib/JavaPiJob.jar", "com.astrolabsoftware.AstroLabNet.DB.Jobs.JavaPiJob");
Server server = w.server("Local");
String result = server.livy().sendJob(job);
System.out.println(result);
System.exit(1);
// Send Action to Spark server
// ---------------------------
Action action = w.action("ScalaPiAction");
Server server = w.server("Local");
String result = server.livy().executeAction(action);
System.out.println(result);
System.exit(1);
// Send Job to Spark server
// ------------------------
Job job = w.job("ScalaPiJob");
Server server = w.server("Local");
String result = server.livy().sendJob(job);
System.out.println(result);
System.exit(1);
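
Under the hood, server.livy().executeAction and server.livy().sendJob presumably map onto Livy's REST API: an interactive Action becomes a statement in a session (POST /sessions, then POST /sessions/{id}/statements), while a compiled Job is submitted as a batch (POST /batches). A sketch of the request bodies, assuming standard Livy conventions; the helper names are illustrative.

```python
import json

# Livy statement kinds per language (from the Livy REST API)
KINDS = {"PYTHON": "pyspark", "SCALA": "spark", "R": "sparkr"}

def statement_payload(code, language):
    """Body for POST /sessions/{id}/statements (an interactive Action)."""
    return json.dumps({"code": code, "kind": KINDS[language]})

def batch_payload(jar, class_name):
    """Body for POST /batches (a compiled Job)."""
    return json.dumps({"file": jar, "className": class_name})

print(statement_payload("1+1", "PYTHON"))
print(batch_payload("../lib/JavaPiJob.jar",
                    "com.astrolabsoftware.AstroLabNet.DB.Jobs.JavaPiJob"))
```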


How to set up a private server

  1. Install:
    1. Spark
    2. Livy
    3. HBase
  2. Start all servers:
    ${SPARK_HOME}/bin/spark-shell
    ${LIVY_HOME}/bin/livy-server start
    ${HBASE_HOME}/bin/start-hbase.sh autostart
    ${HBASE_HOME}/bin/hbase-daemon.sh start rest
    
  3. Create and prefill HBase tables:
    ${HBASE_HOME}/bin/hbase shell < ../src/ruby/createCatalog.rb
    ${HBASE_HOME}/bin/hbase shell < ../src/ruby/createJournal.rb
    ${HBASE_HOME}/bin/hbase shell < ../src/ruby/createTopology.rb
    ${HBASE_HOME}/bin/hbase shell < ../src/ruby/fillTopology.rb
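
The ruby scripts above run inside the HBase shell; when only the REST gateway is reachable, the same tables can presumably also be created through it with PUT /{table}/schema. A sketch of the schema body HBase REST expects, assuming a single column family; the family name "d" is a guess, the real families are defined in createCatalog.rb.

```python
import json

def table_schema(table, families):
    """JSON body for PUT http://<rest-server>:8080/{table}/schema."""
    return json.dumps({
        "name": table,
        "ColumnSchema": [{"name": f} for f in families],
    })

# Hypothetical single family "d"; the real layout comes from createCatalog.rb
print(table_schema("astrolabnet.catalog.1", ["d"]))
```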
    




Bugs, Inconsistencies and ToDos


Release History

Related Documentation

Presentations


J.Hrivnac, 15/Jul/2019 at 14:42:34 CEST