Try using count_big(*) instead -- unlike count(*), it returns bigint rather than int, so it won't overflow once the count exceeds 2,147,483,647.
In this example, I'm counting hundreds of millions of rows per product type, where each row is roughly 300 bytes wide, in order to estimate how many gigabytes of storage I need to provision.
select
    x.ProductType,
    count_big(*) as ProdCount,
    count_big(*) * 300 as TotalBytes,
    count_big(*) * 300 / 1073741824.0 as Gigabytes
from fdwintegration.etl.fuelpriceindex x
inner join fdwintegration.etl.FuelProducts y
    on x.ProductIndicator = y.ProductIndicator
    and x.ProductType = y.ProductType
where x.ProductIndicator = 'D'
group by x.ProductType
order by x.ProductType;
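As a quick sanity check on the arithmetic, here is the same bytes-to-gigabytes calculation outside of SQL. The 300-bytes-per-row figure comes from the example above; the 300-million row count is just a hypothetical input for illustration.

```python
BYTES_PER_ROW = 300          # approximate row width from the example above
BYTES_PER_GB = 1024 ** 3     # 1073741824, same divisor as in the query

def estimated_gigabytes(row_count: int, bytes_per_row: int = BYTES_PER_ROW) -> float:
    """Estimate storage in GB for row_count rows of the given width."""
    return row_count * bytes_per_row / BYTES_PER_GB

# e.g. a hypothetical product type with 300 million rows:
print(round(estimated_gigabytes(300_000_000), 1))  # ~83.8 GB
```

Note that the division is floating-point; in the query, dividing by 1073741824.0 (rather than the integer 1073741824) serves the same purpose, since bigint division in T-SQL truncates the fractional part.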