We use MATLAB and the Psychophysics Toolbox to control all of our experiments, including studies involving concurrent electrophysiology and eye-movement monitoring (Eastman & Huk, 2012). Data acquisition, input/output, and co-registration of all these elements are handled by a specialized piece of hardware from vPixx. These disparate elements are integrated via a set of MATLAB functions that we call PLDAPS (pronounced "platypus"). The acronym reflects our original goal of an integrated PLexon-DAtapixx-PSychtoolbox system, with a single Mac controlling all experimental logic and video display, but the tools are easily adapted to almost any modern electrophysiology acquisition system. Reliance on the vPixx DataPixx video-I/O device is more central, and we assume allegiance to Psychtoolbox (Brainard, 1997; Pelli, 1997).
PLDAPS is not a "killer app" that generates stereotyped experiments; rather, it is a set of basic tools that work within a state-machine framework. It is a flexible collection of MATLAB-based code components that allow precise interaction with both video displays and virtually any physical hardware. This also means that each user will need to invest some effort in adapting these tools to their own particular use. Think of it as an addition to Psychtoolbox rather than a stand-alone stimulus-presentation or experimental-control application.
Perhaps the most novel feature of PLDAPS is that it allows the experimenter to see a copy of the experimental (subject) display with additional "semantics" overlaid, visible only in the experimenter's view: details like the fixation/saccade windows, the subject's real-time gaze position, which choice target is the correct one, and whatever else you want (and put into your code) to view online. This works by using slightly different color look-up tables for the two displays, which in turn has two nice properties. First, the experimenter gets to see the "real" stimulus (with overlaid semantics), as opposed to a bare-bones or otherwise iconic representation of the experiment in one window plus a "raw" copy of the subject's display on a separate screen. Second, because this separation is handled with different colors (and careful layering of animation steps), a single set of code creates both views; this makes almost any conceivable experiment feasible on a modern computer in terms of processing speed, and also avoids possible disconnects between the experimental control code and the display actually presented to the subject.
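To make the dual-CLUT idea concrete, here is a minimal, untested sketch of how such an overlay can be set up using Psychtoolbox's PsychImaging pipeline with a DataPixx. The specific task name and CLUT layout follow the standard Psychtoolbox/VPixx overlay pathway, but this is an illustration of the technique, not PLDAPS's actual implementation; variable names like screenNumber and gazeRect are placeholders.

```matlab
% Sketch: one set of drawing code, two views, via per-display CLUTs.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableDataPixxM16OutputWithOverlay');
[win, winRect] = PsychImaging('OpenWindow', screenNumber, 0.5);
overlay = PsychImaging('GetOverlayWindow', win); % experimenter-only layer

% Two 256x3 look-up tables: subjCLUT maps overlay indices for the subject's
% display, expCLUT for the experimenter's. Index 1 is a red gaze marker
% for the experimenter but invisible (black/transparent) to the subject.
expCLUT        = repmat(linspace(0, 1, 256)', 1, 3); % grayscale base
subjCLUT       = expCLUT;
expCLUT(2, :)  = [1 0 0];  % overlay index 1: red, experimenter view
subjCLUT(2, :) = [0 0 0];  % overlay index 1: hidden, subject view
Datapixx('SetVideoClut', [subjCLUT; expCLUT]);     % stacked CLUT upload

% Per frame: draw the stimulus into 'win' (seen by both), and the
% "semantics" into 'overlay' using CLUT indices as colors.
Screen('FillOval', overlay, 1, gazeRect);          % gaze marker, index 1
Screen('Flip', win);
```

Because the experimenter-only items are just indexed colors in the overlay, the same rendering loop drives both displays with no duplicated drawing code.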
We are fond of this set of tools and are happy to distribute the code for others to use. Some or all of PLDAPS is in use in a number of labs around the world. Please note that, while we enjoy sharing the code and discussing its use, we do not have a full-time staff member dedicated to support, so most happy users are already quite comfortable with MATLAB, Psychtoolbox, and the basics of hardware-software interfacing. At present, usage and support instructions are as follows: (1) download the code from GitHub; (2) figure it out and use it. Optionally: (3) report issues and make contributions via GitHub.
Current versions of PLDAPS can run (a) without a DataPixx/vPixx device (although that will of course limit various I/O operations and require alternative implementations of some of the nicest features); (b) without an attached electrophysiology system; and (c) without an eye tracker. This means you can try out the code on any basic computer running MATLAB and gradually weave in the hardware.
Also note that we have developed and tested PLDAPS only on Mac OS; although there is no principled reason it shouldn't work on Windows or Linux, we do not support Windows at all, and are just starting to set up some Linux-based rigs (we are happy to have received reports of success on the latter front).
The code is on GitHub (https://github.com/huklab/PLDAPS), which now includes a tutorial and some basic sample experiments from which to start your own coding. Feel free to contact Alex to discuss your particular goals and the latest coding developments. The GitHub repository is public; please start with the default branch ("casual").
If you use PLDAPS and wish to reference it (and we hope you will), please cite:
Eastman, K.M. & Huk, A.C. (2012). PLDAPS: A hardware architecture and software toolbox for neurophysiology requiring complex visual stimuli and online behavioral control. Frontiers in Neuroinformatics, 6:1. doi: 10.3389/fninf.2012.00001
3D printing designs
We use a standard 3D printer to make assorted small useful items for the lab and to create prototypes for CNC milling. A few of our lab widgets (e.g., angled recording-chamber adapters, Omnetics connector holders, U-Probe cases) might be of use elsewhere, so we post them on our Thingiverse account here: Huk Lab Thingiverse