Habana Labs and Hugging Face Partner to Accelerate Transformer Model Training

April 13, 2022 — insideBIGDATA Editorial Team

Powered by deep learning, transformer models deliver state-of-the-art performance on a wide range of machine learning tasks, such as natural language processing, computer vision, and speech. However, training them at scale often requires a large amount of computing power, making the whole process long, complex, and costly.

Habana® Labs (https://habana.ai/), a pioneer in high-efficiency, purpose-built deep learning processors, and Hugging Face, the home of Transformer models (https://github.com/huggingface/transformers), announced that they are joining forces to make it easier and quicker to train high-quality transformer models. Thanks to the integration of Habana's SynapseAI software suite (https://habana.ai/training-software/) with the Hugging Face Optimum open-source library (https://github.com/huggingface/optimum), data scientists and machine learning engineers can now accelerate their Transformer training jobs on Habana processors with just a few lines of code, gaining productivity while lowering training cost.

Habana Gaudi training solutions (https://habana.ai/training/), which power Amazon's EC2 DL1 instances and Supermicro's X12 Gaudi AI Training Server, deliver up to 40% better price/performance than comparable training solutions, enabling customers to train more while spending less. Ten 100 Gigabit Ethernet ports integrated into every Gaudi processor allow systems to scale from one to thousands of Gaudis with ease and cost-efficiency. Habana's SynapseAI® software suite was optimized from inception for Gaudi performance and usability; it supports the TensorFlow and PyTorch frameworks, with a focus on computer vision and natural language processing applications.

With more than 60,000 stars on GitHub, 30,000+ models, and millions of monthly visits, Hugging Face is one of the fastest-growing projects in open-source software history and the go-to place for the machine learning community.

Through its Hardware Partner Program (https://huggingface.co/hardware), Hugging Face pairs Gaudi's deep learning hardware with its Transformer toolset. The partnership will enable rapid expansion of the library of transformer models that train on Habana Gaudi, bringing Gaudi's efficiency and ease of use to a wide array of customer use cases such as natural language processing, computer vision, and speech.

"We're excited to partner with Hugging Face and its many open-source developers to address the growing demand for transformer models that benefit from the efficiency, usability, and scalability of the Gaudi training platform," said Sree Ganesan, head of software product management at Habana Labs.

"Habana Gaudi brings a new level of efficiency to deep learning model training, and we're super excited to make this performance easily accessible to Transformer users with minimal code changes through Optimum," said Jeff Boudier, product director at Hugging Face.

To learn how to get started training with Habana Gaudi, visit https://developer.habana.ai.

Sign up for the free insideBIGDATA newsletter: http://insidebigdata.com/newsletter/

Join us on Twitter: @InsideBigData1 – https://twitter.com/InsideBigData1
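To make the "few lines of code" claim concrete, here is a minimal sketch of what the Optimum integration looks like in practice. It follows the class names published in the optimum-habana package (GaudiTrainer and GaudiTrainingArguments, which mirror the familiar transformers Trainer API); exact argument names and the "Habana/bert-base-uncased" Gaudi configuration are assumptions that may differ by version, and running it requires Gaudi hardware with SynapseAI installed.

```python
# Illustrative sketch only: requires Habana Gaudi hardware and the
# optimum-habana package; names follow its documented API but are
# not guaranteed for every release.
from transformers import AutoModelForSequenceClassification
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# Compared with a stock `transformers` Trainer, the Gaudi-specific change
# is essentially swapping in GaudiTrainer/GaudiTrainingArguments and
# pointing at a Gaudi configuration from the Hugging Face Hub.
training_args = GaudiTrainingArguments(
    output_dir="./results",
    use_habana=True,       # execute on Habana Gaudi via SynapseAI
    use_lazy_mode=True,    # lazy-mode graph execution on Gaudi
    gaudi_config_name="Habana/bert-base-uncased",  # assumed Hub config name
)

trainer = GaudiTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # a user-supplied tokenized dataset (assumed)
)
trainer.train()
```

The rest of the training script (data loading, tokenization, evaluation) stays the same as with the standard Trainer, which is the point of the Optimum integration.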