
Merging Outcomes of SAM Applied to RGB and Depth Images in Bin Picking Applications



Marek Franaszek, Prem Rachakonda, Pavel Piliptchak, Kamel S. Saidi


Segmenting images of many objects stacked in unstructured piles is a challenging and vital task in robotic bin picking applications. Objects in such images are congested and occluded but must nevertheless be accurately segmented so that their 6DoF poses can be calculated. For fast completion of automated tasks, many of these poses should be calculated and sent to the robot controller at once, so that the path planning algorithm can prioritize which object to grasp. We show that the Segment Anything Model (SAM) can be used as the first step in processing such images to segment individual parts in a bin. However, applying SAM to red, green, blue (RGB) and depth images acquired from the same bin yields different results, with many segmentation masks present in only one image type. Thus, merging the SAM outputs from both image types is suggested to maximize the number of segmented parts.
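The abstract does not detail the merging procedure, but one plausible sketch of the idea is mask-level deduplication: keep every mask SAM produced on the RGB image, then add any depth-image mask that does not substantially overlap an RGB mask (a part that only the depth pass found). The function names, the boolean-array mask representation, and the IoU threshold below are illustrative assumptions, not the authors' published method.

```python
import numpy as np

def mask_iou(a, b):
    """Intersection-over-union of two boolean masks of equal shape."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union > 0 else 0.0

def merge_masks(masks_rgb, masks_depth, iou_thresh=0.5):
    """Hypothetical merge: keep all RGB masks, then append each depth
    mask whose IoU with every RGB mask is below iou_thresh, i.e. a
    segment that appeared only in the depth image."""
    merged = list(masks_rgb)
    for dm in masks_depth:
        if all(mask_iou(dm, rm) < iou_thresh for rm in masks_rgb):
            merged.append(dm)
    return merged
```

In practice the masks would come from running SAM's automatic mask generator separately on the RGB and depth images of the same bin, with both outputs registered to a common pixel grid so that overlaps are meaningful.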
Proceedings Title: Sensors & Transducers
Conference Dates: April 17-19, 2024
Conference Location: Funchal, PT
Conference Title: 6th International Conference on Advances in Signal Processing and Artificial Intelligence


Keywords: Foundation Models, Segment Anything Model (SAM), RGB and Depth Image, Robotic Bin Picking


Franaszek, M., Rachakonda, P., Piliptchak, P. and Saidi, K. (2024), Merging Outcomes of SAM Applied to RGB and Depth Images in Bin Picking Applications, Sensors & Transducers, Funchal, PT, [online], (Accessed June 25, 2024)



Created April 19, 2024, Updated May 28, 2024