Some tests... [🇷 for BE/BA]

posted by PharmCat – Russia, 2020-08-10 13:48 – Posting: # 21849

Hi ElMaestro,

❝ now it gets really interesting.


I made some tests, and the results are surprising.

First code:
using RCall            # needed for @rput and the R"…" string macro
using BlockDiagonals
using LinearAlgebra
using BenchmarkTools
Bm      = rand(150, 150)                 # dense random 150×150 matrix
blocks  = [rand(3, 3) for i in 1:50]     # fifty 3×3 blocks
BDm     = BlockDiagonal(blocks)
V       = Matrix(BDm)                    # the same block-diagonal data as a plain dense matrix
@rput Bm
@rput V
b1 = @benchmark R"solve(V)"              # R inverts the block-diagonal matrix
b2 = @benchmark R"solve(Bm)"             # R inverts the dense random matrix
b3 = @benchmark inv(Bm)                  # Julia inverts the dense random matrix


b1 1.109 ms vs b2 3.751 ms, so R inverts the block-diagonal matrix more than twice as fast as the dense random one.
In Julia, b3 takes 2.707 ms: inverting the random matrix is faster in Julia than in R, but still slower than b1, where the matrix is block-diagonal.
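If anyone wants to reproduce the comparison, the estimates can be pulled straight from the BenchmarkTools trial objects; a small sketch, assuming b1–b3 from the first code are still in scope:

using BenchmarkTools
median(b1)                      # estimate for R solve(V), block-diagonal
median(b2)                      # estimate for R solve(Bm), dense random
median(b3)                      # estimate for Julia inv(Bm)
judge(median(b2), median(b1))   # relative verdict: b2 versus b1 (time ratio)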

Second code:

using RCall
using BlockDiagonals
using LinearAlgebra
using BenchmarkTools
blocks  = [rand(3, 3) for i in 1:50]     # fifty 3×3 blocks
BDm     = BlockDiagonal(blocks)
V       = Matrix(BDm)                    # dense copy of the block-diagonal matrix
@rput V
@rput blocks                             # transferred to R as a list of matrices
b1 = @benchmark inv(V)                   # Julia, dense inverse
b2 = @benchmark inv.(blocks)             # Julia, invert the 3×3 blocks one by one
b3 = @benchmark inv(BDm)                 # Julia, inverse of the BlockDiagonal type
b4 = @benchmark R"solve(V)"              # R, dense inverse
b5 = @benchmark R"for (i in 1:50) {solve(blocks[[i]])}"   # R, block-by-block loop


Inverting block by block in R (b5, 3.275 ms) is actually slower: only a little faster than inverting the dense random matrix.
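Just to rule out the explicit for loop as the culprit, the same block-by-block work could also be pushed through lapply (my own variation, not benchmarked above, assuming blocks has already been transferred with @rput as in the second code); the per-call overhead of solve on many tiny matrices probably dominates either way:

b6 = @benchmark R"lapply(blocks, solve)"   # R, block-by-block inversion without the explicit loop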

But inverting the block-diagonal matrix in Julia when the structure is described explicitly takes 56.484 μs (b3 in the second code), which is about 20 times faster than in R.
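As a sanity check (my own addition, not part of the benchmarks above), the structured inverse agrees with the dense one, which also shows that working on the 3×3 blocks is all that has to be done here:

using BlockDiagonals, LinearAlgebra
blocks = [rand(3, 3) for i in 1:50]
BDm    = BlockDiagonal(blocks)
V      = Matrix(BDm)
BDinv  = inv(BDm)                                   # inverse computed from the block structure
Matrix(BDinv) ≈ inv(V)                              # true, up to floating-point error
Matrix(BlockDiagonal(inv.(blocks))) ≈ inv(V)        # same result by inverting each block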

I'm trying to discuss this on the Julia forum here.

I'm not so experienced in R, but I think that Solve.block could increase performance for block-diagonal matrices; docs here.
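For what it's worth, the same trick can be written by hand in Julia without any package; the helper below is only an illustration (hypothetical name blockwise_inv, assuming equal square blocks of size bs on the diagonal), not how Solve.block or BlockDiagonals actually implement it:

using LinearAlgebra
function blockwise_inv(M::AbstractMatrix, bs::Int)
    n   = size(M, 1)
    out = zeros(eltype(M), n, n)
    for i in 1:bs:n
        r = i:(i + bs - 1)
        out[r, r] = inv(M[r, r])    # many small inverses instead of one big one
    end
    return out
end
blockwise_inv(V, 3) ≈ inv(V)        # true for the V built above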
