Hello, I would like to ask how to address the relatively low computational performance I observe when computing gradients of a function built with SymbolicUtils.jl. Below is a sample code for reference:
using SymbolicUtils
using SymbolicUtils.Code
using Symbolics
using BenchmarkTools
using Zygote

@variables a b c d
@variables p1 p2 p3

# Intermediate assignments: c and d are computed from the inputs and parameters
assign_list = [
    Assignment(c, a * p1 + b * p2),
    Assignment(d, c * p1 + b * p3),
]
# Collect the two outputs into a plain Vector
flux_output_array = MakeArray([c, d], Vector)

# Build the function symbolically: the first argument is destructured into (a, b),
# the second into (p1, p2, p3)
func1 = Func([DestructuredArgs([a, b]), DestructuredArgs([p1, p2, p3])], [],
             Let(assign_list, flux_output_array, false))
test_func1 = eval(toexpr(func1))

# Hand-written equivalent for comparison
test_func2(i, p) = begin
    a = i[1]
    b = i[2]
    p1 = p[1]
    p2 = p[2]
    p3 = p[3]
    c = a * p1 + b * p2
    d = c * p1 + b * p3
    [c, d]
end
For a single input, I benchmarked the forward evaluation and the backward (gradient) computation of both functions.
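A minimal sketch of the benchmark calls (the concrete input values here are illustrative):

input = [1.0, 2.0]
params = [0.5, 0.6, 0.7]

# Forward evaluation
@btime test_func1($input, $params)
@btime test_func2($input, $params)

# Zygote needs a scalar output, so sum the result before differentiating
@btime Zygote.gradient((i, p) -> sum(test_func1(i, p)), $input, $params)
@btime Zygote.gradient((i, p) -> sum(test_func2(i, p)), $input, $params)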
The results show that test_func1, generated by SymbolicUtils.jl, is slower than the hand-written test_func2, and the gap is especially large for the gradient computation with Zygote.jl.
For experiments with large datasets, I broadcast the functions over a matrix of inputs and then computed the gradient.
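A sketch of the setup, assuming the inputs are stored as the columns of a matrix (shapes and values are illustrative):

# 2 x N matrix of inputs; broadcast over its columns
inputs = rand(2, 10_000)
params = rand(3)

# Ref(params) keeps the parameter vector from being broadcast over
@btime test_func1.(eachcol($inputs), Ref($params))
@btime test_func2.(eachcol($inputs), Ref($params))

# Gradient with respect to the parameters only
@btime Zygote.gradient(p -> sum(sum, test_func1.(eachcol($inputs), Ref(p))), $params)
@btime Zygote.gradient(p -> sum(sum, test_func2.(eachcol($inputs), Ref(p))), $params)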
The results indicate that in large-scale data experiments, the gradient computation cost of test_func1 is significantly higher than that of test_func2.
Therefore, I would like to ask: is there room to improve the efficiency of functions constructed with SymbolicUtils.jl, and why do such functions exhibit lower computational efficiency, especially in gradient computation?