Beau Legeer gives a great screencast demonstrating some of the new features in IDL 8.4; it's a quick way to see them in action.

## Basics

IDL adds some functional programming syntax in 8.4:

• lambda functions/procedures (inline routines)
• filter, map, and reduce methods

Why is functional programming desirable? With side-effect-free functions, the order of computation is not important, making it easier to distribute computations over multiple cores. MapReduce, an increasingly popular programming model for handling large data in distributed environments, is built on this idea. While these additions to IDL don’t make any of this directly possible yet, they do provide some of the necessary groundwork. And, of course, some are simply happier in a functional programming environment.
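A quick sketch of the new syntax, assuming the 8.4 forms of LAMBDA and the LIST methods (the variable names are mine):

```
; a lambda function that squares its argument
f = lambda(x: x^2)
print, f(3)   ; prints 9

; FILTER, MAP, and REDUCE are methods on LIST
nums = list(1, 2, 3, 4, 5)
evens = nums.filter(lambda(x: x mod 2 eq 0))
squares = nums.map(lambda(x: x^2))
total = nums.reduce(lambda(accum, x: accum + x))
```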


IDL 8.4 adds a new routine, CODE_COVERAGE (docs), which returns information about the lines of a routine that have been executed. Using CODE_COVERAGE is fairly straightforward: you do not need to enable code coverage beforehand, just call CODE_COVERAGE at any time to find the lines of a routine that have been executed. Note that the routine must have been compiled before you call CODE_COVERAGE (even if you are just clearing its status). Also, pay particular attention to the definition of a “line of code” in the docs, e.g., empty lines, comments, and END statements do not count. Between the return value and the output from the EXECUTED keyword, you should get all the lines of code in a routine.
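A minimal usage sketch, assuming the CLEAR and EXECUTED keywords as described above (the routine name is just an example):

```
; the routine must be compiled before CODE_COVERAGE can report on it
resolve_routine, 'mg_subs', /either

; clear any previously collected status for the routine
!null = code_coverage('mg_subs', /clear)

; ... run code that exercises MG_SUBS ...

; the return value gives the unexecuted lines; the executed lines
; come back via the EXECUTED keyword
untested = code_coverage('mg_subs', executed=executed)
```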

CODE_COVERAGE joins the timing routines PROFILER1, TIC, and TOC as another useful developer tool. I think CODE_COVERAGE has a range of uses, but the most interesting for me is the ability to determine the coverage of a unit test suite, i.e., how much of my code base is executed by my tests?

I have already implemented some basic test coverage information in my unit testing framework, mgunit. For example, mgunit can now tell me that I’m missing coverage of a few lines in the helper routines for MG_SUBS:

"mg_subs_ut" test case starting (5 tests)
test_basic: passed (0.000223 seconds)
test_derived: passed (0.000354 seconds)
test_derived2: passed (0.000369 seconds)
test_derived_perverse: passed (0.000477 seconds)
test_not_found: passed (0.000222 seconds)
Test coverage: 90.5%
Untested lines
mg_subs_iter: lines 135
mg_subs_getvalue: lines 72-73, 79
Completely covered routines
mg_subs
Results: 5 / 5 tests passed, 0 skipped

This means that after the unit tests have been run, line 135 of MG_SUBS_ITER and lines 72-73 and 79 of MG_SUBS_GETVALUE have not been executed. This is useful (though not complete) information for determining if you have enough unit tests. Grab mgunit from the master branch on GitHub to give it a try (see mglib for an example of unit tests that take advantage of it). I’m not sure of the exact format for displaying the results yet, but I am fairly certain of the mechanism for telling the unit tests which routines they are testing (an ::addTestingRoutine method). I intend to start using this for the unit tests of my products GPULib and FastDL soon!
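A hypothetical sketch of registering routines with ::addTestingRoutine; the base class name and exact calling convention are assumptions on my part and may differ from what mgunit actually uses:

```
; in a test case's init method, register the routines this suite
; is responsible for covering
function mg_subs_ut::init, _extra=e
  compile_opt strictarr

  if (~self->MGutTestCase::init(_extra=e)) then return, 0

  ; tell mgunit which routines these tests exercise
  self->addTestingRoutine, ['mg_subs', 'mg_subs_iter', 'mg_subs_getvalue']

  return, 1
end
```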

1. There is also a CODE_COVERAGE keyword to PROFILER now that displays the number of lines of a routine that were executed.

IDL 8.4 was released today with a slew of new features. Check out What’s New in IDL 8.4 for a list of the new features.

Most interesting feature for me: code coverage. I am going to explore this a bit to see if I can get mgunit to report what code has been tested (and, more importantly, not tested) by your test suite.

Stay tuned for more detailed information about the new features!

Paulo Penteado has updated his Building cross-platform IDL runtime applications article with an über-installation for IDL 8.3 on all current platforms:

I created a package for IDL 8.3. It contains all the files in the current manifest_rt.txt file, which cover all the current platforms: Linux (x86_64), Windows (x86 and x86_64), Mac (x86_64) and Solaris (x86_64 and sparc64).

The über-installation (download) allows MAKE_RT to make all-inclusive IDL runtime applications that work on all platforms.
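With the über-installation in place, a single MAKE_RT call can target several platforms at once. A sketch, assuming the SAVEFILE, VM, OVERWRITE, and per-platform keywords documented for MAKE_RT (the application name and paths are placeholders):

```
; build a runtime distribution of a hypothetical app for several platforms
make_rt, 'myapp', 'C:\dist', $
         savefile='C:\projects\myapp\myapp.sav', $
         /vm, /overwrite, $
         /win32, /lin32, /lin64, /macint64
```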

Greg Wilson gave a great talk about Software Carpentry at SciPy this year. I think more efforts like the Software Carpentry seminars are greatly needed in science — I’ve mentioned Software Carpentry several times before.

If you are interested in teaching, he highly recommends the book How Learning Works. It gives a summary of the current research in learning with links to the primary sources. I wish I had that when I was teaching.

National Geographic has created new maps showing the extent of floating plastic in the ocean:

Tens of thousands of tons of plastic garbage float on the surface waters in the world’s oceans, according to researchers who mapped giant accumulation zones of trash in all five subtropical ocean gyres. Ocean currents act as “conveyor belts,” researchers say, carrying debris into massive convergence zones that are estimated to contain millions of plastic items per square kilometer in their inner cores.

Two ships covered the world in nine months to collect this data.

via FlowingData

When writing even small applications, it is often necessary to distribute resource files along with your code. For example, images and icons are frequently needed by GUI applications. Custom color table files or fonts might be needed by applications that create visualizations. Defaults might be stored in other data files. But how do you find these files, when the user could have installed your application anywhere on their system?
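One common approach, sketched here, is to locate resources relative to the source file of a routine via ROUTINE_FILEPATH; the routine name and directory layout (source and resources as sibling directories under the application root) are assumptions for illustration:

```
; locate a resources directory relative to this routine's source file
function myapp_resource_dir
  compile_opt strictarr

  ; directory containing this routine's .pro (or .sav) file
  src_dir = file_dirname(routine_filepath('myapp_resource_dir', /is_function))

  ; resources assumed to live in a sibling directory of the source
  return, filepath('resources', root_dir=file_dirname(src_dir))
end
```

Because the path is computed from the code's own location, the application works no matter where the user installed it.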

Exelis VIS announced the topics for VISualize 2014:

Presentations and discussions will focus on topics such as:

• Using new data platforms such as UAS, microsatellites, and SAR sensors
• Remote sensing solutions for precision agriculture
• Drought, flood, and extreme precipitation event monitoring and assessment
• Wildfire and conservation area monitoring, management, mitigation, and planning
• Monitoring leaks from natural gas pipelines

See the video for more information and then register or submit an abstract.

UPDATE 9/18/14: postponed until 2015.

I’ve been dealing with HDF 5 files for quite awhile, but the IDL interface was as painful as the C interface. IDL did have H5_BROWSER and H5_PARSE to make things a bit easier, but those utilities are intended for interactive browsing of a dataset, not for efficient, programmatic access. I created a set of routines for dealing with HDF 5 files that I have been extending, as needed, to other scientific data formats such as netCDF, HDF 4, and IDL save files.
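For context, this is what programmatic access looks like with IDL's low-level HDF 5 API, which the routines above are meant to simplify; the filename and dataset name are placeholders:

```
; read a dataset from an HDF 5 file with the low-level API
file_id = h5f_open('data.h5')
dataset_id = h5d_open(file_id, '/temperature')
data = h5d_read(dataset_id)
h5d_close, dataset_id
h5f_close, file_id
```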


#### GPULib

GPULib enables IDL developers to access the high-performance capabilities of modern NVIDIA graphics cards without knowledge of CUDA programming.

#### TaskDL

TaskDL is a task-farming solution for IDL designed for loosely-coupled parallel applications where no communication between the nodes of a cluster is required.

#### mpiDL

mpiDL is a library of IDL bindings for Message Passing Interface (MPI) used for tightly-coupled parallel applications.

#### Remote Data Toolkit

The Remote Data Toolkit is a library of IDL routines allowing for easy access to various scientific data in formats such as OPeNDAP, HDF 5, and netCDF.

#### Modern IDL

Modern IDL offers IDL programmers one place to look, for beginners and advanced users alike. The book contains a thorough tutorial on the core topics of IDL; a comprehensive introduction to the object graphics system; common problems and gotchas, with many examples; and advanced topics not normally found elsewhere, discussed throughout the book: regular expressions, advanced widget programming, performance, object-oriented programming, etc.

#### IDLdoc

IDLdoc is an open source utility for generating documentation from IDL source code and specially formatted comments.

#### mgunit

mgunit is an open source unit testing framework for IDL.

#### rIDL

rIDL is an open source IDL command line replacement.

#### mglib

mglib is an open source library of IDL routines in areas of visualization, application development, command line utilities, analysis, data access, etc.