Christopher Rackauckas (ChrisRackauckas)
# Four ways to write 2x + 3; compare the code each one generates.
f(x) = 2.0x + 3.0                # plain multiply then add
g(x) = muladd(x, 2.0, 3.0)       # fuse only if it is fast on this CPU
h(x) = fma(x, 2.0, 3.0)          # always a single correctly rounded fma
k(x) = @fastmath 2.0x + 3.0      # allow LLVM to reassociate the arithmetic

# Inspect the LLVM IR emitted for each definition
@code_llvm f(4.0)
@code_llvm g(4.0)
@code_llvm h(4.0)
@code_llvm k(4.0)
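For context on why the fused versions differ numerically (this example is not part of the original gist): `fma` performs the multiply and add with a single rounding, so it can recover bits that the two-step form has already discarded. A minimal sketch using the classic `0.1 * 10.0` case:

```julia
# 0.1 is not exactly representable; the double closest to it, times 10,
# rounds back to exactly 1.0. So the unfused form loses the residual:
unfused = 0.1 * 10.0 - 1.0      # 0.0 — the product already rounded to 1.0

# fma computes 0.1*10.0 - 1.0 exactly, then rounds once,
# exposing the representation error of 0.1 (exactly 2^-54):
fused = fma(0.1, 10.0, -1.0)

println(unfused)   # 0.0
println(fused)     # 5.551115123125783e-17
```

`muladd` gives the compiler the *choice* between these two behaviors, which is why its `@code_llvm` output can match either `f` or `h` depending on the target CPU.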
julia> using DataFrames
INFO: Precompiling module DataFrames.
WARNING: `@vectorize_1arg` is deprecated in favor of compact broadcast syntax. Instead of `@vectorize_1arg`'ing function `f` and calling `f(arg)`, call `f.(arg)`.
in depwarn(::String, ::Symbol) at .\deprecated.jl:64
in @vectorize_1arg(::ANY, ::ANY) at .\deprecated.jl:986
in include_from_node1(::String) at .\loading.jl:532
in include(::String) at .\sysimg.jl:14
in include_from_node1(::String) at .\loading.jl:532
in include(::String) at .\sysimg.jl:14
in macro expansion; at .\none:2 [inlined]
function do_something(a::Float64, b::Float64)
    println("In original do_something")
    return a + b
end

function do_all(a::Float64, b::Float64)
    result = 0.0
    result += do_something(a, b)
    result += custom_do_something(a, b)  # hook supplied elsewhere by the user
    return result
end
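To make the snippet above runnable end to end, here is a self-contained sketch in which `custom_do_something` — the user-supplied hook that `do_all` expects but the gist does not define — is given a hypothetical implementation:

```julia
function do_something(a::Float64, b::Float64)
    println("In original do_something")
    return a + b
end

# Hypothetical user-supplied hook; any (Float64, Float64) -> Number works.
custom_do_something(a::Float64, b::Float64) = a * b

function do_all(a::Float64, b::Float64)
    result = 0.0
    result += do_something(a, b)
    result += custom_do_something(a, b)
    return result
end

println(do_all(2.0, 3.0))   # (2.0 + 3.0) + (2.0 * 3.0) = 11.0
```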
ChrisRackauckas / xeonbenchresults.jl
Created September 26, 2016 22:31
Libm.jl Benchmark Results on Intel(R) Xeon(R) CPU E5-2667 v4 @ 3.20GHz
commit 00b056c1dd6e25af7c6486b656c0b2cd427269ca
julia> versioninfo()
Julia Version 0.6.0-dev.770
Commit e665592* (2016-09-25 12:40 UTC)
Platform Info:
System: Linux (x86_64-redhat-linux)
CPU: Intel(R) Xeon(R) CPU E5-2667 v4 @ 3.20GHz
WORD_SIZE: 64
BLAS: libopenblas (DYNAMIC_ARCH NO_AFFINITY Haswell)
commit 00b056c1dd6e25af7c6486b656c0b2cd427269ca
julia> versioninfo()
Julia Version 0.5.0
Commit 3c9d753 (2016-09-19 18:14 UTC)
Platform Info:
System: NT (x86_64-w64-mingw32)
CPU: AMD FX(tm)-8350 Eight-Core Processor
WORD_SIZE: 64
Julia Version 0.6.0-dev.734
Commit 413ed79 (2016-09-21 08:29 UTC)
Platform Info:
System: NT (x86_64-w64-mingw32)
CPU: Intel(R) Core(TM) i7-4770K CPU @ 3.50GHz
WORD_SIZE: 64
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.7.1 (ORCJIT, haswell)
julia> versioninfo()
Julia Version 0.6.0-dev.770
Commit e665592* (2016-09-25 12:40 UTC)
Platform Info:
System: Linux (x86_64-redhat-linux)
CPU: Intel(R) Xeon(R) CPU E5-2667 v4 @ 3.20GHz
WORD_SIZE: 64
BLAS: libopenblas (DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblasp.so.0
LIBM: libopenlibm
Requirement already satisfied (use --upgrade to upgrade): numpy>=1.6 in c:\users\chris\.julia\v0.6\conda\deps\usr\lib\site-packages (from pydstool)
Building wheels for collected packages: pydstool, scipy
Running setup.py bdist_wheel for pydstool ...
C:\Users\Chris\AppData\Local\Julia-0.5.0\bin>gdb --args julia-debug test_script.jl
GNU gdb (GDB) 7.10.1
Copyright (C) 2015 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-w64-mingw32".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:

This is a response to @akis on the Julia Discourse forum. I don't want to clutter the forum, but there are some things you have said in the last few days that are objectively false, and I feel a duty to the community to make sure they are corrected. Please do not take this as an attack on you or your character; I only want to ensure people can read correct information.

Let me start with the one I find most amusing. I am sorry for writing this in a sarcastic manner, but I felt like it was too good of an opportunity to pass up. Yesterday, I stated here that:

> I am pretty sure that a pinned post and mod intervention can easily make a big community switch to a new active forum in 14 days. Just a gut feeling.

You responded with:

>Thanks for clearing up that your objection is based either on "gut feeling" or rumors or lack of much care for anything beyond personal convenience. My own experience and research on th