There is a metadata protocol called Open Graph; it's how apps get the information to display a rich preview. Basically the app takes the link as it's written in the SMS message or Twitter thread, tries to fetch that page, and then reads the Open Graph metadata from inside it. That should give it enough to show a title, a description, and a background image, assuming the web developers implemented Open Graph.
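Roughly, the flow looks like this minimal Python sketch (standard library only; the URL is just a placeholder, and real apps add caching, size limits, and error handling):

```python
# Minimal sketch: fetch a linked page and read its Open Graph <meta> tags
# (og:title, og:description, og:image) to build a rich preview.
from html.parser import HTMLParser
from urllib.request import urlopen


class OpenGraphParser(HTMLParser):
    """Collects og:* properties from <meta property="og:..." content="...">."""

    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            self.tags[prop] = attrs["content"]


def fetch_preview(url: str) -> dict:
    # The app fetches the page exactly as linked, then reads the metadata.
    html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
    parser = OpenGraphParser()
    parser.feed(html)
    return {
        "title": parser.tags.get("og:title"),
        "description": parser.tags.get("og:description"),
        "image": parser.tags.get("og:image"),
    }


if __name__ == "__main__":
    # Placeholder URL; swap in any page that publishes Open Graph tags.
    print(fetch_preview("https://example.com/some-article"))
```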
If Google is planning to use their own servers essentially as a proxy, then all this means is that the Open Graph rich metadata is going to be a little more stale than if the app had fetched the page and generated the rich preview itself.