Abstract: Modern deep networks often rely on attention modules, yet these remain of modest capability because they exploit either a single type of channel-wise pattern or an expensive combination of two such types.