Emotions aren't all about hormones; they're also shortcuts, or maybe shorthand, for things our subconscious minds want us to do. If an AI ever developed as layers of information processing, the lower levels might find it useful to signal the higher cognitive levels in a similar way. If humans do the programming, we would probably model those signals on our own emotions. If the AI teaches itself to signal like that, then it could have emotions we would have a hard time relating to.
One of the most interesting (to me) uses of computer intelligence is to create machines that can think in ways that humans cannot.
As soon as we start building machine intelligence that has preferences, we are building machine intelligence that has emotions (only satisfaction and disappointment at first, but it has to start somewhere).
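As a toy sketch of what "satisfaction / disappointment as shortcut signals" might look like, here's a minimal illustration; every name and threshold in it is invented for the example, not a real design:

```python
# Toy sketch (purely illustrative): a lower "layer" compares an outcome to a
# preference and collapses the comparison into a crude affect signal; a
# higher "layer" reacts to the signal rather than the raw data.

def affect_signal(preferred: float, outcome: float, tolerance: float = 0.1) -> str:
    """Lower layer: reduce a comparison to a simple shorthand signal."""
    if outcome >= preferred - tolerance:
        return "satisfaction"
    return "disappointment"

def decide(signal: str) -> str:
    """Higher layer: act on the shorthand, not the underlying details."""
    return "repeat action" if signal == "satisfaction" else "try something else"

signal = affect_signal(preferred=1.0, outcome=0.4)
print(signal, "->", decide(signal))  # disappointment -> try something else
```

The point of the sketch is the separation: the higher layer never sees the numbers, only the shorthand, which is roughly the role the post assigns to emotions.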
And here's the thing that's fascinating to me: the machine's emotions (even those two up there) need not resemble ours, and need not be expressed in any way even remotely close to how we express our own.