It seems it can, with up to 91% accuracy, if Stanford University researchers Yilun Wang and Michal Kosinski are to be believed. The pair developed an AI that used deep neural networks to extract features from around 35,000 facial images and classified them by sexual orientation.
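
To make that pipeline concrete, below is a minimal sketch of the kind of approach the study describes: face descriptors from a pretrained deep network, reduced in dimensionality, with a simple logistic-regression classifier fitted on top. The random embeddings and labels here are placeholders so the code runs end to end; they are not the study's data, and the exact feature dimensions and model settings are assumptions, not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data: in the actual study, descriptors came from a
# pretrained face-recognition network; random vectors stand in here
# purely so the sketch is runnable.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 4096))  # one deep-network descriptor per image
labels = rng.integers(0, 2, size=1000)      # binary sexual-orientation label

# Reduce the high-dimensional descriptors, then train a simple
# linear classifier on top of them.
features = PCA(n_components=100).fit_transform(embeddings)
clf = LogisticRegression(max_iter=1000)

# Cross-validated AUC as the evaluation metric.
scores = cross_val_score(clf, features, labels, cv=5, scoring="roc_auc")
print(f"mean AUC: {scores.mean():.2f}")  # ~0.5 on random data, by construction
```

On real labeled face data the same few lines would produce a meaningful score, which is part of what makes the result unsettling: the classifier itself is almost trivial, and all the predictive power sits in the learned facial features.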

The program also clearly outperformed human judges who were asked to determine from a face whether a person was straight or gay. While the humans got it right 61% of the time for men and 54% of the time for women, the software scored 91% and 83%, respectively.

Wang and Kosinski published their study in the Journal of Personality and Social Psychology, writing that "Gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Gay men should tend to have more feminine facial features than heterosexual men - smaller jaws and chins, slimmer eyebrows, longer noses, and larger foreheads. Lesbians tended to use less eye makeup, had darker hair, and wore less revealing clothes (note the higher neckline) - indicating less feminine grooming and style. Furthermore, although women tend to smile more in general, lesbians smiled less than their heterosexual counterparts."

Currently, the program works only with images of white people; bisexual and transgender people were also excluded from the study for now.

While the results of the study are certainly interesting, the researchers say they did not expect them, and they acknowledge that their work could be considered dangerous, especially in countries where being gay is still a crime in 2017: "Our findings expose a threat to the privacy and safety of gay men and women. We were really disturbed by these results and spent much time considering whether they should be made public at all. We did not want to enable the very risks that we are warning against."