Julia: a JIT-compiled language aimed at high-performance scientific computation.
It makes ambitious claims about being the fastest and best thing ever. The community process is problematic, however, and I prefer the proven method of using Python and optimising the performance-sensitive code with one of the many tools available for that.
That said, the idea of a science-users-first JIT language is timely, and Julia is that. Python does have clunky legacy issues in its numeric code. Matlab is expensive and nasty for non-numerics. Lua has some good science libraries and could have filled this niche, but for (IMO) largely sociological reasons it has not acquired the hipness or critical mass of Julia.
And there are some things specific to Julia which are serious selling points, aside from the language-feature one-upmanship. For example, Laplacians.jl by Dan Spielman and co-workers is an advanced toolkit of fast solvers for graph Laplacian linear systems. Julia also has tidy-looking autodiff, in the form of JuliaDiff.
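To give a flavour of how tidy that autodiff is, here is a minimal sketch using ForwardDiff.jl, one of the JuliaDiff packages (assuming it is installed; the function `f` is just an illustrative example of mine):

```julia
using ForwardDiff

# An arbitrary smooth scalar function.
f(x) = sin(x) + x^2

# Exact (dual-number) derivative at x = 1.0, no symbolic setup needed.
df = ForwardDiff.derivative(f, 1.0)  # cos(1.0) + 2.0 ≈ 2.5403
```

The point is that ordinary Julia code differentiates as-is; no special tensor types or graph-building boilerplate required.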
There is a reasonable IDE called Juno, built on Atom. The aspirational ggplot clone is Gadfly. There is Jupyter integration through IJulia.
Dataframes are called, unsurprisingly, DataFrames.jl, and are part of the JuliaStats organisation.
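A minimal sketch of what DataFrames.jl usage looks like, assuming the package is installed (the column names here are made up for illustration):

```julia
using DataFrames
using Statistics  # for mean

df = DataFrame(species = ["a", "b", "a"], mass = [1.2, 3.4, 0.9])

# Split-apply-combine, roughly analogous to a pandas groupby().mean():
by_species = combine(groupby(df, :species), :mass => mean => :mean_mass)
```

The `source => function => target` pair syntax is the package's idiom for column transformations, which reads pleasantly once you are used to it.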
Nonetheless, I’m sticking to Python with occasional R. Between TensorFlow, SciPy, and all the machine-learning hipster tools for concurrency and the like, the huge community and massive supply of support tools make up for the awkwardness. My algorithms are obscure enough without having them implemented in a somewhat obscure language.