Computer Laboratory

TESLA

Getting started

For the moment, TESLA requires a bit of effort to build: it has a very heavyweight dependency (recent LLVM/Clang). The steps required to build are:

  1. acquire prerequisites
  2. build TESLA
  3. install TESLA (optional)
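The three steps above can be sketched as a dry-run script that just prints the commands the sections below walk through. SRC and PREFIX are placeholder paths chosen for the sketch, not anything TESLA mandates:

```shell
#!/bin/sh
# Dry-run sketch of the three steps above: it prints the commands rather
# than running them.  SRC and PREFIX are placeholders, not paths that
# TESLA requires.
SRC=$HOME/tesla
PREFIX=$HOME/opt/tesla
CMDS="git clone https://github.com/CTSRD-TESLA/TESLA.git $SRC
mkdir -p $SRC/build && cd $SRC/build
cmake -G Ninja -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ ..
ninja
cmake -D CMAKE_INSTALL_PREFIX=$PREFIX . && ninja install"
printf '%s\n' "$CMDS"
```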

Prerequisites

Here is a table of dependencies and the names of their ports/packages on various platforms:

  C++ compiler      FreeBSD: base system; Mac OS X: XCode (with Command
                    Line Tools); Ubuntu: clang; Fedora: gcc-c++
  LLVM 3.3          FreeBSD: llvm-devel; elsewhere: build from source
  Clang 3.3         FreeBSD: clang-devel; elsewhere: build from source
  libexecinfo       FreeBSD: libexecinfo; Linux: included with libc
  Git               git (Ubuntu: git-core)
  Subversion        subversion on all platforms
  Ninja             FreeBSD: devel/ninja; Mac OS X: ninja;
                    Ubuntu: ninja-build; Fedora: build from source
  CMake             cmake on all platforms (on Mac OS X, built with
                    --enable-ninja)
  Protocol Buffers  FreeBSD/HomeBrew: protobuf; MacPorts: protobuf-cpp;
                    Ubuntu: libprotobuf-dev and protobuf-compiler;
                    Fedora: build from source
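Under the assumption that pkg, brew, and apt-get are the package managers in use, the table can be collapsed into one suggested install command per platform. The package names are copied from the table; the invocations themselves are a sketch, so check them against your package manager before running anything:

```shell
#!/bin/sh
# Sketch: map the dependency table onto one install command per platform.
# Package names come from the table above; verify before running.
case "$(uname -s)" in
  FreeBSD) CMD="pkg install git subversion ninja cmake protobuf" ;;
  Darwin)  CMD="brew install git ninja cmake protobuf" ;;
  Linux)   CMD="sudo apt-get install clang git-core ninja-build cmake \
libprotobuf-dev protobuf-compiler" ;;
  *)       CMD="(no suggestion for this platform)" ;;
esac
echo "Suggested: $CMD"
```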

Building LLVM/Clang from source

FreeBSD provides LLVM and Clang 3.3 as a port (see prerequisites, above), but on other systems, you will likely need to build Clang/LLVM from source.

$ cd somewhere/to/stash/1GB/of/LLVM
$ svn co http://llvm.org/svn/llvm-project/llvm/branches/release_33 llvm
[...]
U    llvm
Checked out revision 175226.
$ cd llvm/tools
$ svn co http://llvm.org/svn/llvm-project/cfe/branches/release_33 clang
[...]
A    clang/LICENSE.TXT
U    clang
Checked out revision 175226.
$ cd ../..
$ mkdir build
$ cd build
$ cmake -G Ninja -D CMAKE_C_COMPILER=clang -D CMAKE_CXX_COMPILER=clang++ ../llvm   # or gcc/g++
-- The C compiler identification is [...]
-- Check for working C compiler using: Ninja
-- Check for working C compiler using: Ninja -- works
[...]
-- Generating done
-- Build files have been written to: /home/jonathan/LLVM/build
$ ninja
[1932/1932] Linking CXX executable bin/c-index-test

Build TESLA

Once you have LLVM, make sure that it's at the front of your PATH:

$ export PATH=/path/to/LLVM/build/bin:$PATH
$ llvm-config --libdir   # test that llvm-config works
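A slightly stronger check than running llvm-config is to confirm that the copy found on PATH really is the 3.3 build, rather than an older system copy that happens to come first. A minimal sketch:

```shell
#!/bin/sh
# Sketch: check that the llvm-config on PATH is the LLVM 3.3 build.
VERSION=$(llvm-config --version 2>/dev/null || echo none)
case "$VERSION" in
  3.3*) echo "ok: found LLVM $VERSION" ;;
  *)    echo "warning: expected LLVM 3.3, found '$VERSION'" ;;
esac
```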

Next, we download and configure TESLA:

$ git clone https://github.com/CTSRD-TESLA/TESLA.git tesla
Cloning into 'tesla'...
remote: Counting objects: 3138, done.
[...]
$ cd tesla
$ mkdir build
$ cd build
$ cmake -G Ninja -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ ..
-- The C compiler identification is Clang 3.3.0
-- Check for working C compiler using: Ninja
-- Check for working C compiler using: Ninja -- works
-- Detecting C compiler ABI info
[...]
-- Found PROTOBUF: /usr/local/lib/libprotobuf.so
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jonathan/TESLA/build

If the CMake command fails because of libc++ link failures or an inability to find LLVM-Config or AddLLVM, you may need to run one of:

$ cmake -D USE_LIBCXX=false .                                  # if you have libc++ but didn't link LLVM against it
$ cmake -D CMAKE_MODULE_PATH=/<LLVM prefix>/share/llvm/cmake . # if CMake can't find LLVM-Config or AddLLVM
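Rather than typing the LLVM prefix by hand, you can derive the CMAKE_MODULE_PATH argument from llvm-config. This sketch assumes the share/llvm/cmake layout described above (LLVM 3.3) and falls back to /usr/local if llvm-config is not on PATH:

```shell
#!/bin/sh
# Sketch: derive the CMAKE_MODULE_PATH argument from llvm-config.
# Assumes the LLVM 3.3 share/llvm/cmake layout; /usr/local is a fallback.
PREFIX=$(llvm-config --prefix 2>/dev/null || echo /usr/local)
echo "cmake -D CMAKE_MODULE_PATH=$PREFIX/share/llvm/cmake ."
```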

Then we can build TESLA and test it:

$ ninja
[50/50] Linking CXX executable tesla/tools/read/tesla-read

After that, you'll want to be able to actually run TESLA!

Install TESLA (optional)

TESLA can be run in-place from the build directory, but you may prefer to install it on your PATH:

$ cd build
$ cmake -D CMAKE_INSTALL_PREFIX=/some/sensible/path .   # or else things go in /usr/local
$ ninja install
[1/1] Install the project...
-- Install configuration: "Debug"
-- Up-to-date: /some/sensible/path/lib/libtesla.so
-- Installing: /some/sensible/path/bin/tesla-analyse
-- Installing: /some/sensible/path/bin/tesla-instrument
-- Installing: /some/sensible/path/bin/tesla-graph
-- Installing: /some/sensible/path/bin/llvm-triple
-- Installing: /some/sensible/path/bin/tesla-read
-- Installing: /some/sensible/path/bin/tesla
$ export PATH=/some/sensible/path/bin:$PATH

Having either put the build directory on your PATH or installed TESLA, you should now be able to run TESLA commands:

$ tesla graph -help
USAGE: tesla-graph [options] <input file>

OPTIONS:
  automata determinism:
    -u          - unlinked NFA
    -l          - linked NFA
    -d          - DFA
  -help         - Display available options (-help-hidden for more)
  -o=<string>   - <output file>
  -version      - Display the version of this program