{"id":13507,"date":"2021-09-01T07:47:31","date_gmt":"2021-09-01T07:47:31","guid":{"rendered":"https:\/\/engineering.lusha.com\/?p=13507"},"modified":"2021-10-28T06:56:14","modified_gmt":"2021-10-28T06:56:14","slug":"upload-csv-from-large-data-table-to-s3-using-nodejs-stream","status":"publish","type":"post","link":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/","title":{"rendered":"Uploading CSV from Large Data Tables to S3 Using Node.js Stream"},"content":{"rendered":"<p><a href=\"https:\/\/www.lusha.com\/\" rel=\"follow\" target=\"_self\">Lusha<\/a> is a hypergrowth company that has grown exponentially in the past year in number of employees, customers, scale, traffic, and more.<\/p>\n<p>In the engineering department, we often face challenges keeping up with this fast-paced growth in scale, traffic, and ever-increasing data. One such example occurred when a major feature that uploads large CSV files to Amazon S3 stopped working because of performance and memory issues we encountered.<\/p>\n<p>In this blog post, I will describe the challenge we had with this feature, which directly impacted Lusha&#8217;s customers.<\/p>\n<h2>The Problem<\/h2>\n<p>In the Lusha <a href=\"https:\/\/dashboard.lusha.com\" rel=\"follow\" target=\"_self\">dashboard UI<\/a>, customers can add contacts to their personal saved lists. These lists can potentially contain a large number of contacts. 
In the Lusha UI, there is a button to export a CSV containing all contacts.<\/p>\n<p><img decoding=\"async\" class=\"alignnone size-full wp-image-13587\" src=\"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/CSV.png\" alt=\"\" width=\"996\" height=\"560\" srcset=\"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/CSV.png 996w, https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/CSV-300x169.png 300w, https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/CSV-768x432.png 768w\" sizes=\"(max-width: 996px) 100vw, 996px\" \/><\/p>\n<p>When customers press Export to CSV, they are asked to type their email address, and they then receive an email with a download link to a CSV file containing all of their contacts&#8217; information.<\/p>\n<p>Behind the scenes, we send a <a href=\"https:\/\/www.rabbitmq.com\/\" rel=\"nofollow noopener\" target=\"_blank\">RabbitMQ<\/a> message containing all the UI filters the customer selected. On the backend, this process is hooked up to a server that receives the message, pulls the data from our database (we use Cassandra as our main contact database), generates the CSV file on disk, uploads the file to Amazon S3, and sends an email with a download link to the customer.<\/p>\n<p>Here&#8217;s an example of a code snippet from the backend server:<\/p>\n<p>Pay attention to <strong>line 17<\/strong>, where we add all results from the Cassandra query to a global variable named <strong>contacts<\/strong>.<\/p>\n<p><span style=\"font-weight: 400;\"><script src=\"https:\/\/gist.github.com\/f6b84993c23c3c65bc88011caf27f5d1.js\"><\/script><\/span><\/p>\n<p>The code works for pulling hundreds of results, but it&#8217;s not designed to handle hundreds of thousands of results. 
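The shape of the problem can be sketched with plain Node.js (hypothetical names and fake paging; the real code pages through Cassandra, but the effect is the same: the whole result set is buffered in one array):

```javascript
// Sketch of the non-streaming anti-pattern: every fetched page is
// appended to a single in-memory array before the CSV is ever written.
const contacts = [];

// stand-in for the per-page callback of a paged Cassandra query
function onPage(rows) {
  contacts.push(...rows); // the entire result set accumulates here
}

// simulate paging through a large result set, 100 rows at a time
for (let page = 0; page < 1000; page++) {
  const rows = Array.from({ length: 100 }, (_, i) => ({
    id: page * 100 + i,
    name: `contact-${page * 100 + i}`,
  }));
  onPage(rows);
}

// 100,000 row objects are now resident in memory at once; with real
// contact records, this is where the heap exhaustion comes from
console.log(contacts.length);
```

With enough rows and wide enough contact records, the process exceeds V8's default heap limit (roughly 1.5&ndash;2&nbsp;GB on older 64-bit Node.js builds) long before the CSV is finished.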
When faced with this much data, the <a href=\"https:\/\/nodejs.org\/en\/\" rel=\"nofollow noopener\" target=\"_blank\">Node.js<\/a> process crashes with the error: <em><strong>JavaScript heap out of memory<\/strong><\/em>.<\/p>\n<h2>The solution<\/h2>\n<p>The <a href=\"https:\/\/nodejs.org\/api\/stream.html\" rel=\"nofollow noopener\" target=\"_blank\">Node.js stream<\/a> API to the rescue!<\/p>\n<ol>\n<li>Use the Cassandra client driver&#8217;s <a href=\"https:\/\/docs.datastax.com\/en\/developer\/nodejs-driver\/3.3\/#row-streaming-and-pipes\" rel=\"nofollow noopener\" target=\"_blank\">stream API<\/a> to fetch the results (removing the global variable that stores the entire result set in memory),<\/li>\n<li>Create the CSV file using <a href=\"https:\/\/csv.js.org\/stringify\/\" rel=\"nofollow noopener\" target=\"_blank\">csv-stringify<\/a>, which supports the stream API,<\/li>\n<li>Use the <a href=\"https:\/\/nodejs.org\/en\/\" rel=\"nofollow noopener\" target=\"_blank\">Node.js<\/a> <a href=\"https:\/\/nodejs.org\/api\/stream.html#stream_class_stream_passthrough\" rel=\"nofollow noopener\" target=\"_blank\">stream.PassThrough<\/a> class to upload the CSV file to Amazon S3.<\/li>\n<\/ol>\n<p>&nbsp;<\/p>\n<p>Let&#8217;s see the complete code for the streaming flow:<\/p>\n<p><span style=\"font-weight: 400;\"><script src=\"https:\/\/gist.github.com\/867bdf86936d93137f2c770111a28ce8.js\"><\/script><\/span><\/p>\n<h2>In conclusion<\/h2>\n<p>In a hypergrowth company that deals with growing quantities of data and large-scale services, your code&#8217;s processing capacity becomes critical, and you need to think about memory, CPU, and performance.<\/p>\n<p>The <a href=\"https:\/\/nodejs.org\/en\/\" rel=\"nofollow noopener\" target=\"_blank\">Node.js<\/a> stream API is powerful. 
Streams can minimize memory consumption and improve performance, and they are simple to use.<\/p>\n<p>I hope this helps you solve the next memory or performance issue you encounter at your company.<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Lusha is a hypergrowth company that has grown exponentially in the past year in number of employees, customers, scale, traffic, and more. In the engineering department, we often face challenges keeping up with this fast-paced growth in scale, traffic, and ever-increasing data. One such example occurred when one major feature that uploads [&hellip;]<\/p>\n","protected":false},"author":48,"featured_media":13588,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[109,110,130,118],"tags":[128,127,113,129,126],"class_list":["post-13507","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-nodejs","category-performance","category-stream","category-technology","tag-cassandra","tag-csv","tag-nodejs","tag-s3-upload","tag-stream"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Uploading CSV from Large Data Tables to S3 Using Node.js Stream - Lusha Engineering Blog<\/title>\n<meta name=\"description\" content=\"Lusha Engineering Blog - Node.js - In the engineering department, we often face challenges dealing with the faster paced growth in scale, traffic, and data that is growing all the time.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta 
property=\"og:title\" content=\"Uploading CSV from Large Data Tables to S3 Using Node.js Stream - Lusha Engineering Blog\" \/>\n<meta property=\"og:description\" content=\"Lusha Engineering Blog - Node.js - In the engineering department, we often face challenges dealing with the faster paced growth in scale, traffic, and data that is growing all the time.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/\" \/>\n<meta property=\"og:site_name\" content=\"Lusha Engineering Blog\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/LushaData\/\" \/>\n<meta property=\"article:published_time\" content=\"2021-09-01T07:47:31+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-10-28T06:56:14+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png\" \/>\n\t<meta property=\"og:image:width\" content=\"996\" \/>\n\t<meta property=\"og:image:height\" content=\"560\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Rotem Bloom\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@lushadata\" \/>\n<meta name=\"twitter:site\" content=\"@lushadata\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Rotem Bloom\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/\"},\"author\":{\"name\":\"Rotem Bloom\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#\\\/schema\\\/person\\\/aa7547240f6d9c5640ddb5259e58572c\"},\"headline\":\"Uploading CSV from Large Data Tables to S3 Using Node.js Stream\",\"datePublished\":\"2021-09-01T07:47:31+00:00\",\"dateModified\":\"2021-10-28T06:56:14+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/\"},\"wordCount\":477,\"commentCount\":4,\"publisher\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/engineering.lusha.com\\\/wp-content\\\/uploads\\\/2021\\\/09\\\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png\",\"keywords\":[\"cassandra\",\"csv\",\"nodejs\",\"S3-upload\",\"stream\"],\"articleSection\":[\"Node.js\",\"Performance\",\"Stream\",\"Technology\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/\",\"url\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from
-large-data-table-to-s3-using-nodejs-stream\\\/\",\"name\":\"Uploading CSV from Large Data Tables to S3 Using Node.js Stream - Lusha Engineering Blog\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/engineering.lusha.com\\\/wp-content\\\/uploads\\\/2021\\\/09\\\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png\",\"datePublished\":\"2021-09-01T07:47:31+00:00\",\"dateModified\":\"2021-10-28T06:56:14+00:00\",\"description\":\"Lusha Engineering Blog - Node.js - In the engineering department, we often face challenges dealing with the faster paced growth in scale, traffic, and data that is growing all the time.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\\\/#primaryimage\",\"url\":\"https:\\\/\\\/engineering.lusha.com\\\/wp-content\\\/uploads\\\/2021\\\/09\\\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png\",\"contentUrl\":\"https:\\\/\\\/engineering.lusha.com\\\/wp-content\\\/uploads\\\/2021\\\/09\\\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png\",\"width\":996,\"height\":560},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/upload-csv-from-large-data
-table-to-s3-using-nodejs-stream\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Uploading CSV from Large Data Tables to S3 Using Node.js Stream\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#website\",\"url\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/\",\"name\":\"Lusha Engineering Blog\",\"description\":\"Search less. Sell more.\",\"publisher\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#organization\",\"name\":\"Lusha\",\"url\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/engineering.lusha.com\\\/wp-content\\\/uploads\\\/2021\\\/07\\\/cropped-fav.png\",\"contentUrl\":\"https:\\\/\\\/engineering.lusha.com\\\/wp-content\\\/uploads\\\/2021\\\/07\\\/cropped-fav.png\",\"width\":512,\"height\":512,\"caption\":\"Lusha\"},\"image\":{\"@id\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/LushaData\\\/\",\"https:\\\/\\\/x.com\\\/lushadata\",\"https:\\\/\\\/www.instagram.com\\\/lifeatlusha\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/lushadata\",\"https:\\\/\\\/www.youtube.com\\\/channel\\\/UCZNqzgRZFBub6WJxeaQcxUQ\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/engine
ering.lusha.com\\\/blog\\\/#\\\/schema\\\/person\\\/aa7547240f6d9c5640ddb5259e58572c\",\"name\":\"Rotem Bloom\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/79f7826d26247e8fa67dde875dd55e80973c4f14bd6d9d4d430d1886b3d4c433?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/79f7826d26247e8fa67dde875dd55e80973c4f14bd6d9d4d430d1886b3d4c433?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/79f7826d26247e8fa67dde875dd55e80973c4f14bd6d9d4d430d1886b3d4c433?s=96&d=mm&r=g\",\"caption\":\"Rotem Bloom\"},\"description\":\"First of all a software developer. mastering TDD and micro-service architecture.\",\"sameAs\":[\"https:\\\/\\\/www.lusha.com\\\/\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/rotembloom\\\/\"],\"url\":\"https:\\\/\\\/engineering.lusha.com\\\/blog\\\/author\\\/rotem\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Uploading CSV from Large Data Tables to S3 Using Node.js Stream - Lusha Engineering Blog","description":"Lusha Engineering Blog - Node.js - In the engineering department, we often face challenges dealing with the faster paced growth in scale, traffic, and data that is growing all the time.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/","og_locale":"en_US","og_type":"article","og_title":"Uploading CSV from Large Data Tables to S3 Using Node.js Stream - Lusha Engineering Blog","og_description":"Lusha Engineering Blog - Node.js - In the engineering department, we often face challenges dealing with the faster paced growth in scale, traffic, and data that is growing all the 
time.","og_url":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/","og_site_name":"Lusha Engineering Blog","article_publisher":"https:\/\/www.facebook.com\/LushaData\/","article_published_time":"2021-09-01T07:47:31+00:00","article_modified_time":"2021-10-28T06:56:14+00:00","og_image":[{"width":996,"height":560,"url":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png","type":"image\/png"}],"author":"Rotem Bloom","twitter_card":"summary_large_image","twitter_creator":"@lushadata","twitter_site":"@lushadata","twitter_misc":{"Written by":"Rotem Bloom","Est. reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#article","isPartOf":{"@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/"},"author":{"name":"Rotem Bloom","@id":"https:\/\/engineering.lusha.com\/blog\/#\/schema\/person\/aa7547240f6d9c5640ddb5259e58572c"},"headline":"Uploading CSV from Large Data Tables to S3 Using Node.js 
Stream","datePublished":"2021-09-01T07:47:31+00:00","dateModified":"2021-10-28T06:56:14+00:00","mainEntityOfPage":{"@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/"},"wordCount":477,"commentCount":4,"publisher":{"@id":"https:\/\/engineering.lusha.com\/blog\/#organization"},"image":{"@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#primaryimage"},"thumbnailUrl":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png","keywords":["cassandra","csv","nodejs","S3-upload","stream"],"articleSection":["Node.js","Performance","Stream","Technology"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/","url":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/","name":"Uploading CSV from Large Data Tables to S3 Using Node.js Stream - Lusha Engineering Blog","isPartOf":{"@id":"https:\/\/engineering.lusha.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#primaryimage"},"image":{"@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#primaryimage"},"thumbnailUrl":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png","datePublished":"2021-09-01T07:47:31+00:00","dateModified":"2021-10-28T06:56:14+00:00","description":"Lusha Engineering Blog - Node.js - In the engineering department, we often face challenges dealing with the faster paced growth in scale, 
traffic, and data that is growing all the time.","breadcrumb":{"@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#primaryimage","url":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png","contentUrl":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/09\/Upload-CSV-from-large-data-table-to-S3-using-NodeJS-stream.png","width":996,"height":560},{"@type":"BreadcrumbList","@id":"https:\/\/engineering.lusha.com\/blog\/upload-csv-from-large-data-table-to-s3-using-nodejs-stream\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/engineering.lusha.com\/blog\/"},{"@type":"ListItem","position":2,"name":"Uploading CSV from Large Data Tables to S3 Using Node.js Stream"}]},{"@type":"WebSite","@id":"https:\/\/engineering.lusha.com\/blog\/#website","url":"https:\/\/engineering.lusha.com\/blog\/","name":"Lusha Engineering Blog","description":"Search less. 
Sell more.","publisher":{"@id":"https:\/\/engineering.lusha.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/engineering.lusha.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/engineering.lusha.com\/blog\/#organization","name":"Lusha","url":"https:\/\/engineering.lusha.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/engineering.lusha.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/07\/cropped-fav.png","contentUrl":"https:\/\/engineering.lusha.com\/wp-content\/uploads\/2021\/07\/cropped-fav.png","width":512,"height":512,"caption":"Lusha"},"image":{"@id":"https:\/\/engineering.lusha.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/LushaData\/","https:\/\/x.com\/lushadata","https:\/\/www.instagram.com\/lifeatlusha","https:\/\/www.linkedin.com\/company\/lushadata","https:\/\/www.youtube.com\/channel\/UCZNqzgRZFBub6WJxeaQcxUQ"]},{"@type":"Person","@id":"https:\/\/engineering.lusha.com\/blog\/#\/schema\/person\/aa7547240f6d9c5640ddb5259e58572c","name":"Rotem Bloom","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/79f7826d26247e8fa67dde875dd55e80973c4f14bd6d9d4d430d1886b3d4c433?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/79f7826d26247e8fa67dde875dd55e80973c4f14bd6d9d4d430d1886b3d4c433?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/79f7826d26247e8fa67dde875dd55e80973c4f14bd6d9d4d430d1886b3d4c433?s=96&d=mm&r=g","caption":"Rotem Bloom"},"description":"First of all a software developer. 
mastering TDD and micro-service architecture.","sameAs":["https:\/\/www.lusha.com\/","https:\/\/www.linkedin.com\/in\/rotembloom\/"],"url":"https:\/\/engineering.lusha.com\/blog\/author\/rotem\/"}]}},"_links":{"self":[{"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/posts\/13507","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/users\/48"}],"replies":[{"embeddable":true,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/comments?post=13507"}],"version-history":[{"count":10,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/posts\/13507\/revisions"}],"predecessor-version":[{"id":13627,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/posts\/13507\/revisions\/13627"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/media\/13588"}],"wp:attachment":[{"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/media?parent=13507"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/categories?post=13507"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/engineering.lusha.com\/blog\/wp-json\/wp\/v2\/tags?post=13507"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}