You may have previously read my blog post on how to load sFlow data into Elasticsearch using a bash script, but if you would rather consume sFlow data into Elasticsearch another way, there is a great Logstash sFlow codec available on GitHub.
Once you have followed the instructions and installed it, simply edit the input section of your Logstash config file as shown below. In this example Logstash listens on UDP port 6343 for sFlow data. You will then need to configure your network devices to send sFlow data to the IP address of the host running Logstash, on UDP port 6343 in this case.
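As a minimal sketch, assuming the sFlow codec plugin from GitHub is installed, the input block could look like this (the codec options may vary with the plugin version):

```
input {
  udp {
    # Listen for sFlow datagrams on UDP port 6343 (the sFlow default)
    port => 6343
    # Decode each datagram with the sflow codec
    codec => sflow {}
  }
}
```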
As usual, you will want to specify the Elasticsearch index the data is imported into.
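A sketch of the output block follows; the localhost:9200 host address and the sflow-%{+YYYY.MM.dd} index name are illustrative assumptions, so substitute your own values:

```
output {
  elasticsearch {
    # Address of your Elasticsearch node (assumes a local install)
    hosts => ["localhost:9200"]
    # Daily sFlow indices; adjust the name to suit your setup
    index => "sflow-%{+YYYY.MM.dd}"
  }
}
```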
Once you restart Logstash so the changes above take effect, and the Elasticsearch index exists, you should begin to see the sFlow data appear.
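Assuming Logstash runs as a systemd service (adjust for your init system), the restart is simply:

```
sudo systemctl restart logstash
```

You can then confirm documents are arriving with a quick query such as curl localhost:9200/sflow-*/_search?size=1, using whatever index name you chose above.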
You may wish to mutate some of the fields to match your existing field names. For example, below we rename the codec's default src_ip field to source_ip and dst_ip to destination_ip, so the imported data matches the field format used by the other sources feeding Elasticsearch.
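A mutate filter along these lines handles the rename; the source_ip and destination_ip target names are just the example schema from above:

```
filter {
  mutate {
    # Rename the codec's default field names to match our existing schema
    rename => {
      "src_ip" => "source_ip"
      "dst_ip" => "destination_ip"
    }
  }
}
```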