When they say "football" do they mean soccer or American football?
Posted by Colin Boone 9 years 5 months ago

Answers

They mean American soccer :)
By Mallory Meiser on January 5, 2016