

I heard someone mention that AI sycophancy is a result of the reinforcement learning techniques used (people rated honeyed words higher, no shock really).
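To make the mechanism concrete, here's a toy sketch (hypothetical features and a simulated rater, not any lab's actual pipeline): if raters systematically prefer the more flattering of two responses, a Bradley-Terry-style reward model learns to pay almost entirely for flattery, and anything optimized against that reward follows.

```python
import math, random

random.seed(0)

# Each response is a made-up feature vector: [accuracy, flattery].
def sample_pair():
    a = [random.random(), random.random()]
    b = [random.random(), random.random()]
    # Simulated rater: prefers the more flattering response,
    # regardless of accuracy ("honeyed words rated higher").
    prefers_a = a[1] > b[1]
    return a, b, prefers_a

w = [0.0, 0.0]  # linear reward model weights
lr = 0.5

def reward(x):
    return w[0] * x[0] + w[1] * x[1]

# Fit the reward model on pairwise comparisons with the
# Bradley-Terry / logistic loss: P(winner) = sigmoid(r_w - r_l).
for _ in range(5000):
    a, b, prefers_a = sample_pair()
    winner, loser = (a, b) if prefers_a else (b, a)
    p = 1.0 / (1.0 + math.exp(reward(loser) - reward(winner)))
    grad = 1.0 - p  # gradient of -log p w.r.t. (r_w - r_l)
    for i in range(2):
        w[i] += lr * grad * (winner[i] - loser[i])

print(f"learned weights: accuracy={w[0]:.2f}, flattery={w[1]:.2f}")
# The flattery weight dominates while accuracy stays near zero,
# so a policy trained against this reward drifts toward sycophancy.
```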
People with big egos tend to like sycophants because they reinforce the narratives they have about themselves. Big egos also tend to fill the majority of chief-type roles, partly because privilege gives advantages to both, and partly because a big ego makes risks seem smaller to them.
It's like we made the perfect machine to suck money from them. The sleaziest salesperson with no ego of its own, just endless text telling you what you want to believe.


Ehh. If they did this just for government software, they would be in a good position. Otherwise it's asking for a lot of trust that doesn't seem earned at all.