bromanous

Alveo 280 - Can't allocate large buffer on a DDR bank


Hello,

I have an Alveo U280 and I am using Vitis 2019.2 with the xilinx_u280_xdma_201920_3 platform (XRT) on CentOS 7.6.

The board's shell is up to date (xilinx_u280_xdma_201920_3, [ID=0x5e278820], [SC=4.3.10]) and it passes the validation test.

The U280 has two DDR banks of 16 GB each, but for some reason I can't allocate a buffer larger than approximately 3.9 GB on a DDR bank. If I try to allocate a buffer of 4 GB or more, I get an error such as:

XRT build version: 2.5.309
Build hash: 9a03790c11f066a5597b133db737cf4683ad84c8
Build date: 2020-02-24 02:54:37
Git branch: 2019.2_PU2
PID: 125620
UID: 1006
[Sun Apr 19 17:21:52 2020]
HOST: **********
EXE: /home/****/Vadd_HLS/Vaddr_HLS
[XRT] ERROR: std::bad_alloc

../src/host.cpp:84 Error calling cl::Buffer buffer_in1 (context,CL_MEM_USE_HOST_PTR | CL_MEM_READ_ONLY, vector_size_bytes, source_in1.data(), &err), error code is: -6
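
For context, the call on line 84 of host.cpp that fails is essentially the following (a minimal sketch reconstructed from the error message above; the vector type and surrounding names are my assumptions, not the exact tutorial source):

#include <vector>
#include <CL/cl2.hpp>

// Reconstruction of the failing allocation (names taken from the error
// message above, not copied from the tutorial source).
cl::Buffer make_input_buffer(cl::Context &context, size_t num_elements) {
    cl_int err;
    std::vector<int> source_in1(num_elements);
    size_t vector_size_bytes = num_elements * sizeof(int);

    // Once vector_size_bytes reaches 4 GB, this constructor fails with
    // error code -6 (CL_OUT_OF_HOST_MEMORY) on this XRT/Vitis version.
    cl::Buffer buffer_in1(context,
                          CL_MEM_USE_HOST_PTR | CL_MEM_READ_ONLY,
                          vector_size_bytes, source_in1.data(), &err);
    return buffer_in1;
}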

Here is my configuration file:

[connectivity]
sp=vadd_1.in1:DDR[0]
sp=vadd_1.in2:DDR[0]
sp=vadd_1.out:DDR[0]

This particular error comes from the example in the Xilinx Vitis-Tutorials repository on GitHub (I am using this one because it is reproducible):

https://github.com/Xilinx/Vitis-Tutorials/blob/master/docs/mult-ddr-banks/README.md

However, I tried the same thing with an RTL kernel that computes A[i] = A[i] + 1 and a connectivity of sp=dram_0_1.m00_axi:DDR[0], allocating the buffer as follows:

cl_mem_ext_ptr_t mem_ext;
mem_ext.obj = 0;                         // no host pointer supplied
mem_ext.param = 0;                       // not tied to a kernel argument
mem_ext.flags = 32 | XCL_MEM_TOPOLOGY;   // select the memory bank by index

d_A = clCreateBuffer(context, CL_MEM_READ_WRITE | CL_MEM_EXT_PTR_XILINX,
                     number_of_bytes, &mem_ext, &err);

I also get an error with code -6.

I think that -6 comes from line 143 of "cl.h", which is:

#define CL_OUT_OF_HOST_MEMORY -6

I changed the pointer types from 32-bit (unsigned int or cl_uint) to 64-bit (unsigned long int or cl_ulong).
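
For what it's worth, the 64-bit types matter for the size arithmetic alone, independent of any XRT limit. A quick sketch of the overflow that a 32-bit size variable causes:

#include <cstdint>
#include <cstdio>

int main() {
    // A 4 GiB size does not fit in a 32-bit unsigned type: the
    // multiplication wraps around to 0, so 64-bit types (size_t,
    // cl_ulong) are needed just to express the requested size.
    unsigned int size_32 = 1024u * 1024u * 1024u * 4u;       // wraps to 0
    uint64_t size_64 = 1024ull * 1024ull * 1024ull * 4ull;   // 4294967296
    std::printf("32-bit: %u, 64-bit: %llu\n",
                size_32, (unsigned long long)size_64);
    return 0;
}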

My host server has 128 GB of memory and I can see that the process has plenty of memory to use. The Emulation-HW process uses much more memory than the Hardware process, without any error.

When I run "xbutil top -i 1" I can see DDR[0] being assigned 16 GB, then the process dies and I get the [XRT] ERROR: std::bad_alloc message!

I don't get any errors when using the Emulation-HW. In that mode I can even allocate a 16 GB buffer.

 

Can someone please help me figure out what I am doing wrong?

Thank you very much

Best Regards,
B.Romanous
Accepted Solution
bromanous

I thought I would update the post with the answer in case someone reads this post later.

"IMPORTANT: A single buffer cannot be bigger than 4 GB, yet to maximize throughput from the host to global memory, Xilinx also recommends keeping the buffer size at least 2 MB if possible."

From: https://www.xilinx.com/html_docs/xilinx2019_2/vitis_doc/Chunk641954935.html#vpy1519742402284
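
In practice this means a data set larger than 4 GB has to be split across several buffers. A minimal sketch of one way to do that (the chunk size and function name are my own illustration, not from the documentation):

#include <algorithm>
#include <cstddef>
#include <vector>
#include <CL/cl2.hpp>

// Split a large host array into sub-4 GB OpenCL buffers so that each
// individual cl::Buffer stays below the documented per-buffer limit.
std::vector<cl::Buffer> make_chunked_buffers(cl::Context &context,
                                             char *host_data,
                                             size_t total_bytes) {
    const size_t chunk_bytes = 1ull << 31;  // 2 GiB per buffer, under the 4 GB limit
    std::vector<cl::Buffer> buffers;
    cl_int err;
    for (size_t offset = 0; offset < total_bytes; offset += chunk_bytes) {
        size_t this_chunk = std::min(chunk_bytes, total_bytes - offset);
        buffers.emplace_back(context,
                             CL_MEM_USE_HOST_PTR | CL_MEM_READ_ONLY,
                             this_chunk, host_data + offset, &err);
    }
    return buffers;
}

Each chunk can still be mapped to the same DDR bank via the sp connectivity options shown above, with the kernel invoked once per chunk (or the kernel interface extended to take several buffer arguments).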

Best Regards,
B.Romanous
