To those who enjoy gardening: do you find it ethically questionable that many garden centers and local shops sell species considered invasive for your area? Do you make sure to purchase only native plants, or has the thought never occurred to you? I've bought plants for various reasons without doing any research. I know better now, but I was just wondering how many people on here have considered it.
DK/DC (though you should): Do you proudly display bumper stickers or decals on your vehicle, and care to share a pic?