Synaptic - The Peltarion Blog

22 Jun 2014

Peltarion DeepGrid

[Image: Peltarion DeepGrid]

We're happy to introduce Peltarion DeepGrid, a high-performance platform for massive deep neural networks.

  • State of the art deep neural networks (DNNs)
    • Fully connected with dropout, maxout & ReLU
    • Denoising and stacked autoencoders
    • Convolutional nets
    • Recurrent nets
  • Massive size and data support (millions of neurons and gigabytes of data)
  • Trained on GPUs with thousands of cores.
  • Trained on Peltarion servers and scalable across Amazon Elastic Compute Cloud (EC2) clusters.
  • Easily integrated into the Peltarion Concurrent Module System (CMS)
  • Compact deployment (training takes massive computational power but execution of a trained system is fast)

DeepGrid @ EC2

Peltarion DeepGrid runs on Amazon EC2 clusters, which allows us to give our users massive, scalable and very affordable computational power. The basic GPU instance has 1,536 computation cores, and hundreds, thousands or even tens of thousands of instances can be used in parallel.

Example: a DNN takes 10 hours to train on a single GPU (1,536 cores), and 1,000 variations of it are trained in order to optimize hyperparameters (network size, learning rates, etc.). That is 10 hours × 1,000 runs = 10,000 GPU-hours of work in total. Approximate training times (a rough back-of-the-envelope calculation is sketched after the list):

  • Amazon EC2 (30 instances): 14 days
  • 1xGPU: 1.14 years
  • Octacore CPU with 16 hyperthreads: 45.6 years
  • Single core CPU: 730 years
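
These figures can be reproduced with a small back-of-the-envelope calculation. The sketch below is not DeepGrid code; the CPU slowdown factors (roughly 40x for the octacore CPU and 640x for a single core, relative to one GPU) are simply inferred from the numbers listed above.

    // Back-of-the-envelope reproduction of the training-time figures above.
    // The slowdown factors are inferred from the listed numbers, not measured.
    #include <cstdio>

    int main() {
        const double gpu_hours = 10.0 * 1000.0;      // 10 h per run, 1,000 hyperparameter variations
        const double hours_per_year = 24.0 * 365.0;

        std::printf("EC2, 30 GPU instances: %.1f days\n",  gpu_hours / 30.0 / 24.0);
        std::printf("Single GPU:            %.2f years\n", gpu_hours / hours_per_year);
        std::printf("Octacore CPU (~40x):   %.1f years\n", gpu_hours * 40.0 / hours_per_year);
        std::printf("Single core (~640x):   %.0f years\n", gpu_hours * 640.0 / hours_per_year);
        return 0;
    }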

Using Peltarion DeepGrid to train your DNNs is several orders of magnitude cheaper than doing it on your own hardware.

Workflow

The typical DeepGrid workflow looks like this:

[Image: DeepGrid workflow]

  1. A DNN is trained using DeepGrid and your data (we help you do this)
  2. The trained DeepGrid system is deployed to Peltarion Synapse where it can be combined with other models
  3. The system in Synapse is deployed to CMS or an external application
  4. The DeepGrid model can later be deployed directly to CMS without needing to go through Synapse

For more information, please contact us at system-solutions@peltarion.com

 

22Jun/140

Deep Learning and Deep Neural Networks in Synapse (Part 1)

Deep learning has, with good reason, gotten a lot of attention recently. For a great introduction, see these two talks by Geoffrey Hinton:

Really quick intro to Deep Learning

There are a lot of good, extensive tutorials out there, but we'll give a very compact description here and move on to the practical stuff. A regular neural network consists of layers:

[Image: neural network layers]

You have an input layer, an output layer and a number of "hidden layers". This is how it looks in Synapse:

[Image: a neural network in Synapse]

A neural network with multiple hidden layers is called a Deep Neural Network, or DNN. The traditional approach has been to randomize all the weights in the network before you start training it. You then present data at the input layer, send it through the network and see what comes out at the other end. You compare that output to your desired output and calculate an error (the difference between the actual output and the desired output). That error is then sent back through the network, updating the weights. After doing this many times, the error becomes smaller and smaller. Each layer in a neural network is a feature detector.

Feature detectors

Suppose you are trying to teach a neural network to differentiate between motorcycles and faces in an image. You have a bunch of pixels as input and a 0 (motorcycle in the photo) or a 1 (face in the photo) as the desired output. If you use a single-layer neural network with just one set of weights (one weight per pixel), you are essentially trying to classify each pixel as a "motorcycle pixel" or a "face pixel".

[Image: motorcycle photo, classified pixel by pixel]

As you can imagine, this is basically impossible. For instance, a black pixel could be part of just about anything. It would be much more useful if we had a "wheel detector" and perhaps a "handle detector".

[Image: wheel and handle feature detectors]

This is what neural networks actually can do. In theory a deep neural network is a universal function approximator, meaning that it can capture basically any input <-> output relationship. That's why everyone was so excited about neural nets in the 80s. The problem was that they never really lived up to the expectations. They were good at solving some problems, but usually only slightly better than, or as good as, conventional methods. They were never the amazing universal solution that people had hoped they would be.

As it turns out, the hopeful proponents of neural nets were not wrong at all. There were just three things missing back then: computing power, large data sets and some slight improvements in training technique. The latter is the "deep learning" technique, which as of 2013 is basically sweeping the floor with all other machine learning methods. I very strongly recommend watching one of the two videos posted at the beginning of this post.

Anyway, the slight improvement in technique that made a big difference was this: if you initialize the weights in a neural net in a clever way, you can get much better results than if you start with just random weights. This is especially true if you have big and redundant data sets. Instead of training directly on the input -> desired output mapping, you initialize the weights layer by layer using auto-encoders. An auto-encoder is a really simple thing: it's a small neural network whose desired output is the same as its input. This is a way of encoding knowledge in weights. If you use few weights, it's a form of compression. If you use many weights, it's a form of distributed expansion. Here's a simple auto-encoder in Synapse:

[Image: a simple auto-encoder in Synapse]

Weight layer 1 is the encoding matrix and weight layer 2 is the decoding matrix. We have an input signal that is compressed from 122 variables to 20 variables and then expanded again to the full 122 variables. That means that with this auto-encoder we can compress the data about six-fold. The encoding matrix is forced to learn the patterns that are in the data: it builds what is called a feature detector. The key with deep neural nets is that these feature detectors can be stacked, providing a new level of abstraction at each layer.

[Image: stacked feature detectors]

A functional view of a DNN would be like this:

[Image: functional view of a DNN]
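
For readers who prefer code to diagrams, here is a minimal, self-contained sketch of the auto-encoder idea described above: 122 inputs, 20 hidden units, 122 outputs, trained to reproduce its own input. This is plain C++ for illustration only, not Synapse code, and the learning rate, random data and epoch count are arbitrary assumptions.

    // Minimal auto-encoder sketch: 122 -> 20 -> 122, sigmoid units, plain SGD.
    // Illustration only; sizes follow the example above, everything else is assumed.
    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

    int main() {
        const int nIn = 122, nHid = 20;            // 122 -> 20 -> 122, as in the example
        const double lr = 0.1;
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> w(-0.1, 0.1), u(0.0, 1.0);

        // Encoding matrix (layer 1), decoding matrix (layer 2), plus biases.
        std::vector<double> W1(nHid * nIn), b1(nHid, 0.0), W2(nIn * nHid), b2(nIn, 0.0);
        for (double &v : W1) v = w(rng);
        for (double &v : W2) v = w(rng);

        // One random training sample; real use would loop over a data set.
        std::vector<double> x(nIn);
        for (double &v : x) v = u(rng);

        for (int epoch = 0; epoch < 1000; ++epoch) {
            // Forward pass: encode, then decode.
            std::vector<double> h(nHid), y(nIn);
            for (int j = 0; j < nHid; ++j) {
                double s = b1[j];
                for (int i = 0; i < nIn; ++i) s += W1[j * nIn + i] * x[i];
                h[j] = sigmoid(s);
            }
            for (int k = 0; k < nIn; ++k) {
                double s = b2[k];
                for (int j = 0; j < nHid; ++j) s += W2[k * nHid + j] * h[j];
                y[k] = sigmoid(s);
            }

            // Backward pass: the desired output is the input itself.
            std::vector<double> dy(nIn), dh(nHid, 0.0);
            double mse = 0.0;
            for (int k = 0; k < nIn; ++k) {
                double err = y[k] - x[k];
                mse += err * err;
                dy[k] = err * y[k] * (1.0 - y[k]);          // error at the output pre-activation
            }
            for (int k = 0; k < nIn; ++k)
                for (int j = 0; j < nHid; ++j) dh[j] += dy[k] * W2[k * nHid + j];
            for (int j = 0; j < nHid; ++j) dh[j] *= h[j] * (1.0 - h[j]);

            // Gradient descent updates for both weight layers.
            for (int k = 0; k < nIn; ++k) {
                for (int j = 0; j < nHid; ++j) W2[k * nHid + j] -= lr * dy[k] * h[j];
                b2[k] -= lr * dy[k];
            }
            for (int j = 0; j < nHid; ++j) {
                for (int i = 0; i < nIn; ++i) W1[j * nIn + i] -= lr * dh[j] * x[i];
                b1[j] -= lr * dh[j];
            }
            if (epoch % 200 == 0) std::printf("epoch %4d  mse %.5f\n", epoch, mse / nIn);
        }
        return 0;
    }

After training, the 20 hidden activations are the compressed representation; in a stacked setup they would become the input to the next auto-encoder.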
Stay tuned for Part II

Filed under: Synapse, Theory
20 Jun 2013

Summer of Synapse 2013

Summer is here and Peltarion would like you to enjoy it as much as we do. Therefore we are currently offering a limited number of Synapse licenses at half price during June and July! A number of licenses each month are automatically sold at half price. There is only a limited number though, so act while there is still time. Are there discounted licenses left for this month? Visit our web shop and see for yourself!

Expired trial?

Did you try Synapse two years ago and are interested in testing the great improvements and new features that we've added since then? Or do you simply want a bit more time to evaluate it?

No problem: in coordination with the "Synapse Summer Special" campaign we are releasing a trial extender that will give you a new 30-day trial. You can use the Trial Extender between 2013-06-19 and 2013-07-15 to get 30 more days of Synapse time, starting on the date the trial extender is installed.

To extend the trial, follow these steps:

  1. If you don't have it already, download the latest version of Synapse here and install it.
  2. Download the trial extender plugin here.
  3. Extract the TrialExtender.dll file from the zip file into the Synapse plugins directory (typically found in C:\Program Files\Peltarion\Synapse\Plugins).

Start Synapse and enjoy another 30-day trial.

Filed under: Synapse
25 Feb 2013

Synapse 1.6.0 Released

We've released Synapse 1.6.0. Apart from all the cumulative updates since 1.5.2, it includes:

 

  • Improved support for Windows 8 and the new Microsoft security guidelines
  • Performance improvements (speed and memory)
  • The sensitivity analyzer can now show relative and absolute output sensitivity, in addition to the original error sensitivity.

 

[Image: the Sensitivity Analyzer]

 

If you have Synapse installed and automatic updates enabled, you'll get the 1.6 version just by starting up Synapse (you might need to right-click and select "Run as administrator" if it won't start the new version after the upgrade). Customers can download the full installation package from the customer area. A 30-day evaluation version can be downloaded here.

Filed under: Synapse, Updates
25 Feb 2013

New forum policy

The Peltarion Forums are online again. We turned them off for a while due to the excessive levels of spam. CAPTCHAs have turned out to be rather useless, as the spammers use human beings to crack them.

With all the spam, the forums turned out to be more work than they were worth. Only a very small percentage of users ever even looked at them, much less posted. From customer surveys we've figured out that the primary reason for this is that most of our customers use Synapse for commercial purposes. They tend to be protective of their IP and not really interested in participating in an internet community. The ratio of support emails we get to requests for support through the forums is perhaps 100:1, if not more.

Ultimately we decided to re-open them, but we are not accepting any new outside registrations except from customers.

If you are a customer, log in to your account, go to the support section and click on the email link. Just write that you want an account in the body of the email. For now we'll process the requests manually - we may automate it in the future.

In the future we may allow the registration of non-customer users in some way, but right now we're taking it one step at a time.

 

Filed under: General
23 Aug 2012

Using a system trained in Synapse

One commonly asked question is how to best use a trained system with new data. There are three basic approaches:

  1. Using the probe post-processing component.
  2. Deploying the system (covered in detail by tutorial #2)
  3. Loading the new data into Synapse


As the first two approaches are covered by existing tutorials, this post will cover 3): how you load and use new data within a solution in Synapse.

25 Jun 2012

Synapse 1.5.2 Released

We are happy to announce that Synapse 1.5.2 has been released. This update contains a lot of improvements under the hood, with optimization of the basic math libraries as well as the multicore processing parts. There are, however, two updates that directly affect usability.

In training you will notice that there is a new button in the toolbar:

This is the "Load Best System" button, and behind it is an awesome new feature. One common problem with adaptive systems is that you seldom know how long you should train them: too few epochs and the system won't learn, too many and it will start over-training (losing its capability of generalizing). With the Load Best System function you don't have to worry about that any more. Synapse now keeps track of all the states during training and remembers each time an improvement has occurred. Pressing the "Load Best System" button loads the best system so far, as measured on the validation set. So it doesn't matter if the system has begun over-training or is diverging: you can get the best state with the click of a button.

We've added new functions to the preprocessing mode GUI that will help to keep things organized when dealing with multiple data units and visualizers.

There are two major new things:

1) Colorized data units and visualizer tabs. A visualizer tab gets the same color as the data unit it is showing.
2) You can now rename tabs by right-clicking on them.

The 1.5.2 update is available as an automatic update only. Just start Synapse and unless you have turned off automatic updates you'll get it automatically.

-The Peltarion Team

Filed under: Synapse
25 Jan 2012

Consuming deployed Synapse components from C++

Sometimes you wish to use a deployed Synapse component from software not written in a .NET language. If the software is written in C++, or can make use of native DLLs, you can still use deployed Synapse components, either directly from C++ or by creating a C++ wrapper that exports the key functionality.

This post shows how this can be done by writing a simple application in C++ that uses the deployed component from the famous police tutorial. For the steps described in this post we will not need Synapse, but we will use Microsoft Visual Studio 2010 (make sure it is the C++ version if you use the Express edition).

In order to manage our .NET object in the unmanaged world we will make a class called Client that will handle instantiation and destruction of an instance of the .NET object (in our case Peltarion.Deployed.GoodCopBadCop). In this class we can also implement any access methods to provide communication with the managed object.

We will then use this Client class to instantiate an instance of the deployed Synapse component and evaluate a test sample from the police data file.

First off, start Visual Studio and create a new C++ project. For this exercise we will use a Win32 Console Application. I'm calling this project NativeSyn. (Check "Empty project" on the second page before you click Finish.)

I have copied the Synapse deployment directory GoodCopBadCop to the project folder for easy access. To use it, though, we must add a reference to it. In the Solution Explorer, select the NativeSyn project and hit Alt+Enter to bring up the properties window.

Turn on Common Language Runtime Support under configuration properties

Then add a reference to the DLL containing the deployed Synapse system. For us that will be GoodCopBadCop.dll.

(If you just want the project and source files to play with on your own you can find them at the end of the post)

Now let's add a header file called Client.h and start crafting our Client class. We will need a private void *ref to keep track of the managed .NET object. (We can't point to it directly since it's in managed space but we will get to that.) We will also need an alloc and a free method that will be called from the constructor and destructor respectively. Finally we will need a test method to facilitate the testing of Dutch policemen.
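
A minimal sketch of what Client.h could look like follows. The method names and the double return type of test() are illustrative assumptions, not the exact code from the original listing:

    // Client.h - unmanaged-friendly wrapper around the deployed Synapse component.
    #pragma once

    class Client
    {
    public:
        Client();             // calls alloc()
        ~Client();            // calls free()

        // Feed one CSV-formatted sample to the deployed system and return
        // the value of its output port.
        double test(const char* csvSample);

    private:
        void alloc();         // creates the managed object and registers a GCHandle for it
        void free();          // releases the GCHandle
        void* ref;            // opaque pointer to the GCHandle of the managed object
    };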

For the actual implementation add a source file called Client.cpp.

Include windows.h, Client.h and, in order to communicate with the managed world, vcclr.h. We will also be using the namespaces System and System::Runtime::InteropServices.

Next is the implementation of the constructor and destructor. This is particularly easy as all they have to do is call alloc() and free(). We will however add a #pragma unmanaged prior to their implementation.

The rest of the code will be devoted to dealing with managed interop, so #pragma managed this time. To run anything .NET related we need a #using directive for mscorlib.dll, and we will need another for GoodCopBadCop.dll.

The type we go through all this trouble for is Peltarion.Deployed.GoodCopBadCop, which in C++ becomes Peltarion::Deployed::GoodCopBadCop. Since it is quite long to type, let's make a typedef.
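
Putting these pieces together, the top of Client.cpp could look roughly like this. The assembly and type names follow the post; the alias name DeployedSystem is an assumption:

    // Client.cpp - implementation of the unmanaged wrapper (a sketch).
    #include <windows.h>
    #include "Client.h"
    #include <vcclr.h>                    // managed/unmanaged interop helpers

    #using <mscorlib.dll>                 // core .NET types
    #using "GoodCopBadCop.dll"            // the deployed Synapse assembly

    using namespace System;
    using namespace System::Runtime::InteropServices;

    // Peltarion::Deployed::GoodCopBadCop is long to type, so alias it.
    typedef Peltarion::Deployed::GoodCopBadCop DeployedSystem;

    // The constructor and destructor are plain unmanaged code; all they do
    // is delegate to alloc() and free().
    #pragma unmanaged
    Client::Client()  { alloc(); }
    Client::~Client() { free(); }

    // Everything from here on touches managed types.
    #pragma managed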

In alloc(), the interesting things start. Our .NET type is managed and subject to the CLR garbage collector. If we just instantiate an instance of it and there is no managed pointer to it, we can be pretty certain the GC will destroy and collect it, so we need to notify the GC that we want that object to stay alive. This is done through the concept of internal pointers and GCHandles. I will not go into details here, but the process is to use gcnew to create an object instance, then register it with a GCHandle and finally store a pointer to the GCHandle.

Now, every time we need to use the object we need to go through the GCHandle to find it. When we want to tell the GC we are done with the object, and that it can be safely collected, we need to get our GCHandle and call Free on it.

The implementation of alloc() and free() looks like this:
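
A sketch following the GCHandle approach described above (the exact original code may differ):

    // Sketch of alloc()/free() using the GCHandle approach described above.
    void Client::alloc()
    {
        // Create the managed object and register it with a GCHandle so the
        // garbage collector keeps it alive while unmanaged code holds on to it.
        DeployedSystem^ system = gcnew DeployedSystem();
        GCHandle handle = GCHandle::Alloc(system);

        // Store the handle as an opaque pointer that unmanaged code can keep.
        ref = GCHandle::ToIntPtr(handle).ToPointer();
    }

    void Client::free()
    {
        // Recover the GCHandle from the stored pointer and release it, telling
        // the GC that the managed object may now be collected.
        GCHandle handle = GCHandle::FromIntPtr(IntPtr(ref));
        handle.Free();
        ref = 0;
    }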

The test method also needs to access the object, so we need to use the Target property of the GCHandle to get an internal pointer to it. After that you can use it to access all the methods of the .NET object. We will primarily be interested in the Set_CSV and StepEpoch methods, along with the properties exposing the system output (on this system the output function layer is called FunctionLayer4, so we will access the FunctionLayer4_Port0 property).
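
Again as a sketch only: the Set_CSV, StepEpoch and FunctionLayer4_Port0 members are named above, but their exact signatures (a CSV string in, a double out) are assumptions here:

    // Sketch of the test method; member signatures are assumptions.
    double Client::test(const char* csvSample)
    {
        // Recover the managed object through the GCHandle.
        GCHandle handle = GCHandle::FromIntPtr(IntPtr(ref));
        DeployedSystem^ system = safe_cast<DeployedSystem^>(handle.Target);

        // Convert the native string to a System::String and feed it to the system.
        String^ managedCsv = Marshal::PtrToStringAnsi(IntPtr((void*)csvSample));
        system->Set_CSV(managedCsv);

        // Run the sample through the network and read the output port.
        system->StepEpoch();
        return system->FunctionLayer4_Port0;
    }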

Using the Client in an application is straightforward. We add a main.cpp source file to our project, include Client.h and we are good to go. (If you want, you can turn off Common Language Runtime Support for the main.cpp file and all other files that don't do explicit interop stuff.)
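
A matching main.cpp could be as small as the sketch below. The CSV values are placeholders, not a real row from the police data file:

    // main.cpp - minimal native console program using the wrapper (a sketch).
    #include <cstdio>
    #include "Client.h"

    int main()
    {
        Client client;                                    // creates the managed system
        double output = client.test("0.5,0.1,0.9,0.3");   // placeholder sample
        std::printf("System output: %f\n", output);
        return 0;                                         // the destructor releases the GCHandle
    }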

Now, the moment of truth! Ctrl+F5

This was a small example to get you going. You might want to mimic more of the deployed Synapse component's API, or, if this were a wrapper class catering for some other interop, you would probably export creation, destruction and access methods that suit that particular platform.

The Visual Studio solution that was created for this post can be downloaded here and Microsoft has published a video tutorial covering this same issue over at http://msdn.microsoft.com/en-us/visualc/Video/bb892742

19 Aug 2011

Synapse 1.5 released

Synapse 1.5 has been released and is available for download.

Main highlights:

  • Full multicore support for optimizers.
  • A new Fully Forward Connected Perceptron Array (FFCPA) advanced neural network block.
  • New non-linear gradient optimization algorithms.
  • Improved SQL interoperability and data loading performance.
  • Control system customization (for deployment)
  • Many performance improvements and bug fixes.


If you have Synapse installed and automatic updates enabled, you'll get the 1.5 version just by starting up Synapse - for Windows 7 & Server 2008 installations, read note below. Customers can download the full installation package from the customer area. A 30-day evaluation version can be downloaded here.

Very important upgrade information


Due to recent changes to the way Windows handles file and directory permissions, if you have User Account Control (UAC) activated the Synapse automatic updates will fail. This is because it no longer has permissions to write to its own plugin directory. In order for the upgrade to work you need to right click on the Synapse icon and select "Run as administrator".

You will only need to do this once as the 1.5 version is compatible with the newer Windows permission policy.

30 May 2010

News, updates and plans for 2010


It’s been a while since we published anything here on the blog so here is an update on what’s happening at Peltarion and what the near future looks like.

Two new products

We are currently working hard on the development of two new products. At this point the only things that we can reveal are that the products won't replace Synapse, and that one is planned for a 2011 release and the other for 2012.

Synapse 1.4

The next version of Synapse (1.4) is currently under development. It is feature complete but we have some work left to make it ready for release. It will among other things include multi-core support for optimizers – something that required a complete rewrite of the batch processing system. Synapse 1.4 is planned for release Q2/Q3 2010.

Industry collaboration

Finally, an exciting bit of news is that we have begun a collaboration with Intel. We are not allowed to say anything about it yet, but it involves a combination of their next generation technology and our next generation technology. There will be some seriously cool stuff there.

--The Peltarion Team

25 Jul 2009

Synapse 1.3.6 out

We have released Synapse 1.3.6 which contains a collection of bug fixes. The primary purpose of the release is to consolidate the individual component updates that have been released since 1.3.5 and to fix a common installation problem that people have been running into lately. As releases go this is a minor one.

Among the components that have been patched are:


An important issue with the install of Synapse has now hopefully been fixed. When starting Synapse for the first time on a Vista or Windows 7 machine with User Account Control (UAC) running, trial version users have been getting "License System Corrupted" messages from Synapse. This was due to UAC blocking a part of the install of the system that occurs when you start Synapse for the first time. The workaround was to start Synapse with administrative privileges, but many missed the workaround and could not subsequently get the software running. In 1.3.6, when you start Synapse it will check if it has enough permissions to do the install and, if not, start a new process with administrative rights (the user gets a prompt). Hopefully this will be the end of that particular issue.

We had hoped to be finished with the multicore support for optimizers before the summer, but it will have to wait for the next release. We have a number of new components and new features in the pipeline so the next release will most likely be 1.4.0 - a major one.
Have a nice summer!
-The Peltarion Team
Filed under: Synapse
8 May 2009

New offices

We have moved to new offices at Sveavägen 52 in Stockholm:

[Map: Sveavägen 52, Stockholm]

A few pictures (we’re still decorating):

[Office photos]

Filed under: General
31 Dec 2008

New Year – New Updates

First, let us at Peltarion thank all of our customers for making 2008 a great year. The number of Synapse users nearly doubled, and it is always a great pleasure to see how people use the software to solve problems in ways we couldn't have imagined. We feel truly privileged to be able to help provide solutions across so many interesting domains. We hope that 2009 will be even better and promise to continue improving Synapse, making sure it stays the best development environment for adaptive systems.

Now, for the updates: We've released Synapse 1.3.5. It's mostly a collection of under-the-hood changes that improve performance and usability:

  • Improved GUI performance
  • Multi-threading improvements
  • Full 64-bit support
  • Extended validation data set support
  • New functions in the sensitivity analyzer and probe postprocessor.
  • Algorithmic improvements in control systems and learning rules
  • Faster data processing with SQL and CSV formats

A number of bug fixes are included as well.

If you have Synapse installed and automatic updates enabled, you'll get the latest version by just starting up Synapse. Customers can as always download the full installation package from the customer area. A 30-day evaluation version can be downloaded here.

We are already working on the next release which will include fully multi-threaded optimizers, new powerful adaptive systems blocks and new advanced learning algorithms.

Happy New Year! / The Peltarion Team

Filed under: Synapse
10 Sep 2008

Peltarion Forums Online!

The Peltarion Forums are now finally online. They are provided as a courtesy first to the users of Peltarion software and second to people interested in adaptive systems. Since Synapse was first released in 2006 we have had many requests for community forums to be set up. It took until now because most of the Peltarion staff felt that there were higher priorities than spending a large amount of time moderating and administering the forums. The introduction of community forums was, however, always on the agenda.

This can be considered a test run of the forums. Should they prove popular and work well, we'll keep them. Our goal is that they will in the end be 100% community managed. For now, however, moderation will be done by Peltarion staff, until (and if) forum participation grows large enough to pick moderators from the community. This goes for the wiki-based general documentation system as well. Right now the documentation is read-only for all but Peltarion staff, but should the community grow large enough, it will be opened up for editing by trusted members of the community.

We hope you will find the forums useful and that they will connect you to other people with similar interests in adaptive systems. We know from the support email we get that there are a lot of people working on similar problems. Hopefully the forums will be the place where they can exchange experiences and ideas.

Visit the Peltarion Forums

--The Peltarion Team

Filed under: General
5 Sep 2008

Synapse 1.3.1 updates

 

 

We have released Synapse 1.3.1 which contains a number of improvements.

The most visible change is to the Sensitivity Analyzer postprocessor which now has a new interface and functions for compensating for internal correlation between features.

There are two major groups of fixes that have been made. One concerns how validation sets are handled by control systems and filters. Only having one or more validation sets could cause inconsistent behaviour in some filters, and that has been fixed.

The second group concerns data loading, with major improvements in the SQL format and the CSV file format. In both cases threaded data loading speeds have been vastly improved. Support for accessing a Synapse solution offline (i.e. when the data file/database connection can't be accessed) has also been added. The data unit input manager has been improved as well, to better support threaded loading and offline mode.

Finally, on an unrelated topic, the Peltarion website now fully supports the Google Chrome browser.

--The Peltarion Team

2 Jul 2008

Summer of Synapse

Summer is here and Peltarion would like you to enjoy it as much as we do. Therefore we are currently offering a limited number of Synapse licenses at half price during July and August! The first licenses each month are automatically sold at half price. There is only a limited number though, so act while there is still time.

Are there discounted licenses left for this month? Visit our web shop and see for yourself!

Update: All discounted licenses have now been sold out. Congratulations to everyone that got a discounted license!

Filed under: General, Synapse
30 May 2008

Synapse 1.3 released!

Synapse 1.3 has been released and is available for download. This release is a big one with new features and improvements in over 50 components. Although there are hundreds of improvements in the new version there are three major features that you will notice right away:

  • Integrated help system. On all blocks and filters you can now find a "Help" item in the settings browser. Clicking on it will take you to the relevant documentation in our new documentation system.
  • Script Filter and Script Format components with an advanced Visual Studio-like code editor that allows you to write your own filters and formats directly from Synapse.
  • LSTMs (Long Short-Term Memories) are advanced memory structures for use in dynamic adaptive systems. Unlike standard feedback loops LSTMs can preserve information over indefinite time gaps. With LSTMs previously unsolvable time-series problems can now be handled with ease.

If you have Synapse installed and automatic updates enabled, you'll get the 1.3 version just by starting up Synapse. Customers can download the full installation package from the customer area. A 30-day evaluation version can be downloaded here.

Filed under: General, Synapse, Updates
29 May 2008

Server upgrades

We are currently in the process of upgrading our server hardware and software, as well as launching a new version of the home page and a new version of Synapse. This will cause interruptions in our service for the next 24-48 hours. We apologize for the inconvenience, but we are sure you will like the improvements.

The new documentation system (http://www.peltarion.com/doc) is online and you can reach it if your ISP's DNS entries have been updated (if not, they will be within the next 24 hours). The forums (http://www.peltarion.com/forums) will remain offline for a short while until the upgrades have settled.

A new version of Synapse that features massive improvements will be released tomorrow as both download and automatic update. More on that tomorrow.

We apologize for any inconvenience during this upgrade and hope you will enjoy the new site and the new Synapse 1.3.

--The Peltarion Team

6 May 2008

Deployed systems on Mono?

 

 

First, sorry for the lack of updates on the blog. We've been very busy working on a very large update package for Synapse as well as a new documentation system. We are also migrating to new servers, which does take some time. The good news is that we are seeing the light at the end of the tunnel and hope to have everything finished and ready for release soon. You'll be happy with the new documentation system, which is a vast improvement over the existing one, and it also features a community forum, which I'm sure is good news as many have requested it.

Now to the point of this post, which is a question that we get relatively often: do systems deployed from Synapse work on Mono? (Mono is a .NET framework clone that runs on Linux, Solaris, Mac OS X and a few other platforms.)

The short answer is: No.
The long answer is: Not yet.

The primary problem is a bug in Mono that causes it not to recognize .NET Compact assemblies. Base level components in Synapse are all Compact compatible (so you can use a deployed system on a PocketPC or any Windows Mobile phone and a range of other devices). 

We will look into the Mono problem and try to come up with a solution. Until then, if you are using Linux (or FreeBSD or OS X etc.), you can always use WINE to run systems. In that case you can probably use Synapse as well. It is of course completely unsupported on our part, but it may be worth a try.

Filed under: Synapse, Technical
1 Jan 2008

Merry Updates and a Happy New Year

 

 

What better way to start the New Year than with a load of Synapse updates? The updates contain over 100 bug fixes, improvements and new features. Among the changes are:

  • Wider file format support for CSV and SQL formats
  • Improved Genetic and Swarm optimizer performance and stability
  • Added loading of existing samples/manual entry to the probe component
  • Improved control of Hebbian learning
  • Threaded operation for loading of Synapse as well as large Synapse solutions
  • Improved performance for several input formats

...to name just a few things.

If you have Synapse installed and automatic updates enabled, you'll get the latest version by just starting up Synapse. Customers can as always download the full installation package from the customer area. A 30-day evaluation version can be downloaded here.

Happy New Year! / The Peltarion Team

Filed under: Synapse