
Temporal Attention on LSTM Layer? - General Discussion - Build with Google AI

Category: TensorFlow > General Discussion
Tags: keras, api, help_request
URL: https://discuss.ai.google.dev/t/temporal-attention-on-lstm-layer/31042

Nafees | November 5, 2021, 2:54am | #1
**I want to implement attention on an LSTM layer. Let me give the detailed description:**

We analyze the 8 hidden states of the LSTM, which represent the embeddings for the different parts of an input frame. We consider the first 7 hidden states as the historical temporal context and learn 7 weights corresponding to these states:

    past_context        = [h1; h2; h3; ...; h8]                     (1)
    current             = h8                                        (2)
    transformed_context = tanh(W1 × past_context + b1)              (3)
    weights             = softmax(W2 × transformed_context + b2)    (4)
    final_embedding     = past_context × weights + current          (5)

b1 and b2 denote the biases in the two linear layers, and W1 and W2 represent the 2D matrices in the linear layers. We initially apply a linear transformation followed by a tanh non-linearity, transforming each of these seven vectors of size 128 into seven new vectors of size 128 (Eq. 3). Another linear transformation converts each of these 8 vectors to size 1, essentially giving us a score for each of the hidden states. These scores are then passed through a softmax to give the final set of weights (Eq. 4). These weights are used to calculate a weighted sum of all the 8 hidden states, giving the final embedding for the past context. This past context is added to the last hidden state to give the final embedding for the input frame (Eq. 5). This final embedding is used for classification.
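For reference, the description above maps directly onto a few Keras layers. Here is a minimal sketch that follows Eqs. (1)-(5) literally, with `Dense` layers standing in for W1/b1 and W2/b2; the shapes (8 timesteps, hidden size 128) come from the description, and everything else is an illustrative assumption, not code from the thread:

```python
import tensorflow as tf
from tensorflow.keras import layers

TIMESTEPS, HIDDEN = 8, 128  # from the description: 8 hidden states of size 128


def temporal_attention(hidden_states):
    """hidden_states: (batch, 8, 128), the LSTM outputs h1..h8."""
    current = hidden_states[:, -1, :]                       # Eq. (2): h8, (batch, 128)
    transformed = layers.Dense(HIDDEN, activation="tanh")(hidden_states)  # Eq. (3)
    scores = layers.Dense(1)(transformed)                   # one score per state, (batch, 8, 1)
    weights = tf.nn.softmax(scores, axis=1)                 # Eq. (4): softmax over the 8 states
    past = tf.reduce_sum(weights * hidden_states, axis=1)   # weighted sum, (batch, 128)
    return past + current                                   # Eq. (5): add the current state


# Quick shape check with a toy batch standing in for the LSTM output:
x = tf.random.normal((4, TIMESTEPS, HIDDEN))
print(temporal_attention(x).shape)  # (4, 128)
```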
**Please verify my code against this description. Is it right?**

```python
from tensorflow.keras.layers import Input, Dense, Lambda, Dot, Activation, Concatenate
from tensorflow.keras.layers import Layer
import tensorflow as tf


def attention(lstm_hidden_status):
    # lstm_hidden_status: Tensor("lstm_1/transpose_1:0", shape=(?, 8, 128), dtype=float32)
    hidden_size = lstm_hidden_status.get_shape().as_list()  # get all dimensions as a list
    hidden_size = int(hidden_size[2])  # 128

    # Slice out the last hidden state h8.
    h_t = Lambda(lambda x: x[:, -1, :], output_shape=(hidden_size,),
                 name='last_hidden_state')(lstm_hidden_status)
    # Tensor("last_hidden_state/strided_slice:0", shape=(?, 128), dtype=float32)

    # Linear layer + tanh over every hidden state (cf. Eq. 3).
    transformed_context = Dense(hidden_size, use_bias=True, activation='tanh',
                                name='transformed_context_vec')(lstm_hidden_status)
    # Tensor("transformed_context_vec/Tanh:0", shape=(?, 8, 128), dtype=float32)

    # Score each transformed hidden state against the current state h_t.
    score = Dot(axes=[1, 2], name='attention_score')([h_t, transformed_context])
    # Tensor("attention_score/Squeeze:0", shape=(?, 8), dtype=float32)

    # Linear layer + softmax over the 8 scores (cf. Eq. 4).
    attention_weights = Dense(8, use_bias=True, activation='softmax',
                              name='attention_weight')(score)
    # Tensor("attention_weight/Softmax:0", shape=(?, 8), dtype=float32)

    # Weighted sum of the hidden states.
    context_vector = Dot(axes=[1, 1], name='context_vector')([lstm_hidden_status, attention_weights])
    # Tensor("context_vector/Squeeze:0", shape=(?, 128), dtype=float32)

    # Add the current state to the attended past context (cf. Eq. 5).
    new_context_vector = context_vector + h_t
    # Tensor("add:0", shape=(?, 128), dtype=float32)
    return new_context_vector
```

Specifically, I am confused by this line: `score = Dot(axes=[1, 2], name='attention_score')([h_t, transformed_context])`. Why are we taking a dot product here? All the debug outputs are attached as comments on each line.

aniruthraj | November 13, 2024, 6:15pm | #3

Hi @Nafees,

Apologies for the delay in responding. Yes, it LGTM for your requirements. The dot product between `h_t` and each column of `transformed_context` gives a measure of the relevance (importance) of each previous hidden state with respect to the current input. This attention score is then used to compute a weighted sum of the previous hidden states, which is the final context vector used in the output.

Thank you.
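To make the shape bookkeeping concrete: `Dot(axes=[1, 2])` contracts the feature axis of `h_t` (axis 1, size 128) against the feature axis of `transformed_context` (axis 2, size 128), leaving one scalar per hidden state. A small runnable check (the toy random tensors and batch size of 2 are illustrative):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dot

h_t = tf.random.normal((2, 128))             # last hidden state, (batch, 128)
transformed = tf.random.normal((2, 8, 128))  # transformed context, (batch, 8, 128)

# Contract axis 1 of h_t with axis 2 of transformed: both are the 128-dim feature axes.
score = Dot(axes=[1, 2])([h_t, transformed])
print(score.shape)  # (2, 8): one similarity score per hidden state

# Equivalent einsum: score[b, t] = sum_k h_t[b, k] * transformed[b, t, k]
score2 = tf.einsum("bk,btk->bt", h_t, transformed)
print(np.allclose(score.numpy(), score2.numpy(), atol=1e-5))  # True
```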
id="topic-list-item-30706"> <td class="main-link" itemprop='itemListElement' itemscope itemtype='http://schema.org/ListItem'> <meta itemprop='position' content='2'> <span class="link-top-line"> <a itemprop='url' href='https://discuss.ai.google.dev/t/invalid-argument-required-broadcastable-shapes/30706' class='title raw-link raw-topic-link'>INVALID_ARGUMENT: required broadcastable shapes</a> </span> <div class="link-bottom-line"> <a href='/c/tensorflow/general-discussion-6/37' class='badge-wrapper bullet'> <span class='badge-category-bg' style='background-color: #f2a346'></span> <span class='badge-category clear-badge'> <span class='category-name'>General Discussion</span> </span> </a> <div class="discourse-tags"> <a href='https://discuss.ai.google.dev/tag/model-predict' class='discourse-tag'>model-predict</a> ,&nbsp; <a href='https://discuss.ai.google.dev/tag/autoencoder' class='discourse-tag'>autoencoder</a> </div> </div> </td> <td class="replies"> <span class='posts' title='posts'>1</span> </td> <td class="views"> <span class='views' title='views'>186</span> </td> <td> April 19, 2024 </td> </tr> <tr class="topic-list-item" id="topic-list-item-32490"> <td class="main-link" itemprop='itemListElement' itemscope itemtype='http://schema.org/ListItem'> <meta itemprop='position' content='3'> <span class="link-top-line"> <a itemprop='url' href='https://discuss.ai.google.dev/t/adding-a-transformer-layer/32490' class='title raw-link raw-topic-link'>Adding a transformer layer</a> </span> <div class="link-bottom-line"> <a href='/c/keras/22' class='badge-wrapper bullet'> <span class='badge-category-bg' style='background-color: #bf271b'></span> <span class='badge-category clear-badge'> <span class='category-name'>Keras</span> </span> </a> <div class="discourse-tags"> <a href='https://discuss.ai.google.dev/tag/models' class='discourse-tag'>models</a> ,&nbsp; <a href='https://discuss.ai.google.dev/tag/keras' class='discourse-tag'>keras</a> </div> </div> </td> <td class="replies"> <span class='posts' title='posts'>3</span> </td> <td class="views"> <span class='views' title='views'>864</span> </td> <td> June 15, 2023 </td> </tr> <tr class="topic-list-item" id="topic-list-item-30315"> <td class="main-link" itemprop='itemListElement' itemscope itemtype='http://schema.org/ListItem'> <meta itemprop='position' content='4'> <span class="link-top-line"> <a itemprop='url' href='https://discuss.ai.google.dev/t/multiheadattention-with-2-attention-axes-and-an-attention-mask-how-to-apply-mask/30315' class='title raw-link raw-topic-link'>MultiHeadAttention With 2 Attention Axes And An Attention Mask - How to apply mask</a> </span> <div class="link-bottom-line"> <a href='/c/tensorflow/general-discussion-6/37' class='badge-wrapper bullet'> <span class='badge-category-bg' style='background-color: #f2a346'></span> <span class='badge-category clear-badge'> <span class='category-name'>General Discussion</span> </span> </a> <div class="discourse-tags"> <a href='https://discuss.ai.google.dev/tag/text-vectorization' class='discourse-tag'>text-vectorization</a> ,&nbsp; <a href='https://discuss.ai.google.dev/tag/tfvectorize' class='discourse-tag'>tfvectorize</a> </div> </div> </td> <td class="replies"> <span class='posts' title='posts'>0</span> </td> <td class="views"> <span class='views' title='views'>107</span> </td> <td> April 4, 2024 </td> </tr> <tr class="topic-list-item" id="topic-list-item-30478"> <td class="main-link" itemprop='itemListElement' itemscope itemtype='http://schema.org/ListItem'> <meta itemprop='position' 
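For completeness, here is one way the `attention` function from the original post could be wired into a full classification model, assuming the TF 2.x functional API. The sequence length of 8 and hidden size of 128 come from the thread; the input feature size and number of classes are placeholders:

```python
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

NUM_FEATURES, NUM_CLASSES = 32, 10  # illustrative placeholders, not from the thread

inputs = Input(shape=(8, NUM_FEATURES))
hidden = LSTM(128, return_sequences=True)(inputs)   # (batch, 8, 128)
embedding = attention(hidden)                       # (batch, 128), function from the post above
outputs = Dense(NUM_CLASSES, activation="softmax")(embedding)
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```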
