I’m relatively apolitical, although I do watch the news and read articles online (I follow sports much more closely). From time to time I hear people talking about professors promoting radical ideology, with some calling them “woke.” I graduated from UT in the early 2000s and never saw anything radical.
The closest I came was a biology prof who once told the class he didn’t care for something John Ashcroft (Attorney General under George W. Bush) had said, then went right back to discussing whatever fungus, virus, or bacterium he was covering. What were your experiences with professors in college being (or not being) radical/woke/leftist?