Alternative Computing Technologies (ACT) Lab

School of Computer Science

Georgia Institute of Technology

TABLA is an accelerator generator framework for statistical machine learning algorithms. It is an open-source project released under the Apache License, Version 2.0.


About TABLA

TABLA leverages the insight that many statistical machine learning algorithms can be expressed as stochastic optimization problems. Hence, the programmer's only job is to provide a high-level implementation of the gradient of the objective function, while letting the framework generate the accelerator. The TABLA framework comprises four components.
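
For instance, linear regression, logistic regression, and support vector machines all reduce to stochastic gradient descent over their respective objective functions; only the gradient computation differs. The minimal Python sketch below (an illustration, not TABLA code) shows this separation between the fixed solver structure and the programmer-supplied gradient:

```python
# Minimal Python sketch (not TABLA code) of the stochastic-optimization view
# TABLA builds on: the solver loop is fixed, and only the gradient of the
# objective function changes from one learning algorithm to another.
import math
import random

def sgd(gradient, w, data, lr=0.01, epochs=10):
    """Generic stochastic gradient descent over a list of (x, y) samples."""
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:
            g = gradient(w, x, y)                        # algorithm-specific part
            w = [wi - lr * gi for wi, gi in zip(w, g)]   # fixed update rule
    return w

def logistic_gradient(w, x, y):
    """Gradient of the logistic-regression objective for one (x, y) sample."""
    z = sum(wi * xi for wi, xi in zip(w, x))
    prediction = 1.0 / (1.0 + math.exp(-z))   # sigmoid(w . x)
    return [(prediction - y) * xi for xi in x]

# Swapping in a different gradient function yields a different learner:
# w = sgd(logistic_gradient, w=[0.0, 0.0], data=[([1.0, 2.0], 1), ([2.0, 0.5], 0)])
```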


Programming Language

Our in-house programming language makes it significantly easier for programmers to specify the gradient of the objective function. It provides language constructs and keywords that are common across many machine learning algorithms, and it allows programmers to declare the different types of data involved, such as model inputs and outputs.
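
The exact syntax is covered on the language page; as a rough illustration, the Python sketch below mirrors the kind of information such a specification carries, namely declarations of the data involved plus the gradient expression itself. The declaration kinds and names here are hypothetical stand-ins, not TABLA keywords:

```python
# Hypothetical Python stand-in for a gradient specification; the declaration
# kinds and names below are illustrative only, not TABLA language keywords.
from dataclasses import dataclass

@dataclass
class Decl:
    kind: str   # e.g. "model_input", "model_output", "model", "gradient"
    name: str
    size: int

M = 8  # number of features in this toy example
declarations = [
    Decl("model_input",  "x", M),   # feature vector supplied at run time
    Decl("model_output", "y", 1),   # observed label
    Decl("model",        "w", M),   # parameters being learned
    Decl("gradient",     "g", M),   # gradient produced each iteration
]

def gradient(w, x, y):
    """g[i] = (w . x - y) * x[i]  (linear regression with squared loss)."""
    err = sum(wi * xi for wi, xi in zip(w, x)) - y
    return [err * xi for xi in x]
```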


Design Builder

TABLA provides accelerator templates that can be dynamically added to the accelerator architecture. The programmer-specified machine learning algorithm determines the configuration of the accelerator template. These templates, designed by expert hardware designers, are written in synthesizable Verilog. The design builder generates the accelerator and its interfacing logic from these templates; its final output is a set of synthesizable Verilog files. Internally, it combines a predesigned Verilog template with the programmer-specified gradient function and a high-level specification of the target FPGA platform.
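
As a rough sketch of the template idea, the Python snippet below specializes a tiny, made-up processing-engine template with values that would come from the gradient's dataflow graph and the FPGA specification; both the Verilog text and the parameter names are invented for illustration and are not TABLA's actual templates:

```python
# Illustrative sketch of template-based Verilog generation; the template text
# and parameters are invented and are not TABLA's actual accelerator templates.
from string import Template

PE_TEMPLATE = Template("""\
module processing_engine #(parameter WIDTH = $width) (
    input  wire [WIDTH-1:0] a,
    input  wire [WIDTH-1:0] b,
    output wire [WIDTH-1:0] out
);
    // Operation selected from the dataflow graph of the gradient function.
    assign out = a $op b;
endmodule
""")

def build_pe(op: str, width: int) -> str:
    """Specialize the processing-engine template for one operation."""
    return PE_TEMPLATE.substitute(op=op, width=width)

if __name__ == "__main__":
    # e.g. the gradient function needs multiply and add engines, 32 bits wide
    print(build_pe("*", 32))
    print(build_pe("+", 32))
```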


Model Compiler

The model compiler statically generates the execution schedule for the accelerator. Internally, it first builds the dataflow graph of the gradient function, an effective representation for analyzing data dependencies. It then produces an optimized schedule using the ML-RCS (Minimum-Latency, Resource-Constrained Scheduling) algorithm, so that the algorithm executes with as much parallelism as the accelerator's resources allow.
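
A common heuristic for ML-RCS is list scheduling: in each cycle, issue as many ready operations as the available functional units allow. The Python sketch below illustrates that scheduling style on a toy dataflow graph; it is an illustration of the idea, not TABLA's compiler, and it assumes single-cycle operations:

```python
# Illustrative list-scheduling sketch in the spirit of ML-RCS (not TABLA's
# compiler): schedule a dataflow graph onto a fixed number of functional
# units, assuming every operation takes one cycle.
from collections import defaultdict

def list_schedule(dfg, num_units):
    """dfg maps each node to its list of predecessors.
    Returns {cycle: [nodes issued in that cycle]}."""
    done, schedule, cycle = set(), defaultdict(list), 0
    while len(done) < len(dfg):
        # A node is ready once all of its predecessors finished earlier.
        ready = [n for n in dfg
                 if n not in done and all(p in done for p in dfg[n])]
        issued = ready[:num_units]       # respect the resource constraint
        schedule[cycle].extend(issued)
        done.update(issued)
        cycle += 1
    return dict(schedule)

# Toy dataflow graph for g[i] = (w . x - y) * x[i] with two features.
dfg = {
    "mul0": [],                # w0 * x0
    "mul1": [],                # w1 * x1
    "add":  ["mul0", "mul1"],  # w . x
    "sub":  ["add"],           # (w . x) - y
    "g0":   ["sub"],           # err * x0
    "g1":   ["sub"],           # err * x1
}
print(list_schedule(dfg, num_units=2))
# -> {0: ['mul0', 'mul1'], 1: ['add'], 2: ['sub'], 3: ['g0', 'g1']}
```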


License

This source code is published under the terms of the Apache License, Version 2.0.

Copyright 2016 Hadi Esmaeilzadeh

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.


Patrons