Statistical observable deviation is a clear mathematical concept at the core of analyzing these two types of issues. In numerical methods you work with finite sample sets, and there are infinitely many algorithmically generated distributions that do not deviate statistically from the theoretical one at any level detectable above sampling or representation error. That is how you quantify relevance: it is applied statistics and probability theory on finite sample sets.
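A concrete way to ask whether a finite sample set deviates detectably from its theoretical distribution is a goodness-of-fit test. A minimal sketch in Python, computing the Kolmogorov-Smirnov distance against the standard normal (the sample size and seed are arbitrary choices for illustration):

```python
import math
import random

def normal_cdf(x):
    """CDF of the standard normal, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ks_statistic(samples):
    """Kolmogorov-Smirnov distance between the empirical CDF of
    `samples` and the standard normal CDF."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = normal_cdf(x)
        # The empirical CDF jumps from i/n to (i+1)/n at x.
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

random.seed(12345)
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]
d = ks_statistic(samples)
# For 10k genuinely normal samples D is small (order 1/sqrt(n));
# a generator whose output deviates detectably would push D up.
print(d)
```

Whether a measured D counts as a "detectable" deviation is then a question of the test's significance level at the given sample size, which is exactly the finite-sample statistics point above.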
Box-Muller, or any other such method, is, as you said, a theoretical concept that cannot be implemented exactly in numerics; it can only be approximated with a finite-length number representation system and finite-precision operations on it.
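To make the approximation point concrete, here is a standard double-precision Box-Muller sketch; every input and intermediate is a finite float, so the exact transform is only ever approximated:

```python
import math
import random

def box_muller(u1, u2):
    """One step of the Box-Muller transform: maps two uniforms on
    (0, 1] x [0, 1) to two independent standard normal deviates.
    Every operation runs in finite (double) precision, so the
    result approximates the exact mathematical transform."""
    r = math.sqrt(-2.0 * math.log(u1))  # log(0) would blow up, hence (0, 1]
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

random.seed(1)
# 1.0 - random.random() maps [0, 1) onto (0, 1], avoiding log(0).
pairs = [box_muller(1.0 - random.random(), random.random())
         for _ in range(50_000)]
zs = [z for pair in pairs for z in pair]
mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / len(zs)
print(mean, var)  # close to 0 and 1, up to sampling error
```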
Floating-point resolution issues have been discussed extensively, no? They can be addressed in the sense that you can refine the representation (more bits, etc.) and provably push certain deviations below some threshold. You can do this without changing the algorithms themselves (ziggurat and Box-Muller need no modification), so it is an orthogonal issue.
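The orthogonality can be illustrated with Python's decimal module: the same formula (here just the Box-Muller radius factor sqrt(-2 ln u1); decimal has no cosine, so the angle part is omitted) is evaluated at two working precisions, and the deviation between them shrinks as precision grows. The function name and thresholds below are illustrative choices, not from the original discussion:

```python
from decimal import Decimal, getcontext

def radius(u1, prec):
    """Box-Muller radius factor sqrt(-2 ln u1), evaluated at a
    chosen working precision. The algorithm is untouched; only
    the number representation is refined."""
    getcontext().prec = prec
    return (Decimal(-2) * Decimal(u1).ln()).sqrt()

r_lo = radius("0.5", 30)   # ~30 significant digits
r_hi = radius("0.5", 60)   # ~60 significant digits
getcontext().prec = 60
err = abs(r_hi - r_lo)
print(err)  # shrinks as the lower precision grows; here well below 1e-25
```

The representation error is bounded by the unit in the last place of the coarser precision, so the bound is provable for a fixed input without touching the transform itself.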