Comments (3)
Hi, thanks for reporting the issue. I did some experimenting; it is caused by the metadata set on the LLVM fmul intrinsic. If I change it from "fpexcept.strict" to "fpexcept.ignore", the result of llvm-jit/aot matches the result of fast-jit:
diff --git a/core/iwasm/compilation/aot_llvm.c b/core/iwasm/compilation/aot_llvm.c
index 3af56e8b..ead8a203 100644
--- a/core/iwasm/compilation/aot_llvm.c
+++ b/core/iwasm/compilation/aot_llvm.c
@@ -2504,7 +2504,7 @@ aot_create_comp_context(const AOTCompData *comp_data, aot_comp_option_t option)
char *cpu = NULL, *features, buf[128];
char *triple_norm_new = NULL, *cpu_new = NULL;
char *err = NULL, *fp_round = "round.tonearest",
- *fp_exce = "fpexcept.strict";
+ *fp_exce = "fpexcept.ignore";
char triple_buf[128] = { 0 }, features_buf[128] = { 0 };
uint32 opt_level, size_level, i;
LLVMCodeModel code_model;
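For context, that string ends up as the exception-behavior metadata argument of LLVM's constrained floating-point intrinsics, which the AOT compiler emits in place of a plain fmul. A sketch of what the resulting IR looks like (operand names `%a`, `%b`, `%r` are illustrative):

```llvm
%r = call double @llvm.experimental.constrained.fmul.f64(
         double %a, double %b,
         metadata !"round.tonearest",
         metadata !"fpexcept.strict")
```

With "fpexcept.strict" the optimizer must assume the operation can raise FP exceptions, which restricts how it may rewrite or reorder the multiply; with "fpexcept.ignore" it has more freedom, which may be why a different input NaN's payload ends up in the result.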
It affects the result of the second fmul during the second call of func 2:
(func (;2;) (type 0)
i32.const 0
i32.const 255
i32.store8
f64.const nan (;=nan;)
i32.const 0
f64.load
f64.const 0x0p+0 (;=0;)
f64.mul
f64.mul => The two inputs are 7ff8000000000000 and 7ff80000000000ff;
with fpexcept.strict, the mul result is 7ff80000000000ff,
with fpexcept.ignore, the mul result is 7ff8000000000000.
from wasm-micro-runtime.
Thank you for your reply! But I'm still confused: what makes the different JIT modes generate different binary sequences that cause different multiplication results? In addition, could you please explain how to pinpoint the buggy instructions, if convenient? Thanks again!
The LLVM-JIT leverages the LLVM framework, while FAST-JIT's framework is self-implemented in WAMR; their pipelines and codegens are different, so the results may differ. Sometimes we have to check the LLVM IR and related attributes to see why.
To pinpoint the buggy instructions, I normally compare the execution result of each wasm opcode between the two running modes. To do that, you can refactor the wasm opcodes, e.g. comment out the opcodes after the one you want to check (changing the function result type if needed), let iwasm print the result of that opcode, and check whether the results of the two running modes differ.
Another possible method is to use wabt's wasm-interp to trace the execution, e.g. /opt/wabt/bin/wasm-interp -t -r <func> <wasm file>, and then use the AOT trace feature in PR #2647. But that feature is still experimental and we have not had the bandwidth to finish it yet.
You can also dump the LLVM IR, e.g. wamrc --format=llvmir-unopt -o test.ll test.wasm, or dump the object file with wamrc --format=object -o test.o test.wasm and then disassemble the machine code with objdump -d test.o.