๊ด€๋ฆฌ ๋ฉ”๋‰ด

๋ชฉ๋ก๋ฐ€์ง‘ํ‘œํ˜„ (2)

DATA101

[NLP] Word2Vec: (1) ๊ฐœ๋…

๐Ÿ“š ๋ชฉ์ฐจ1. Word2Vec ๊ฐœ๋…2. ํฌ์†Œํ‘œํ˜„๊ณผ์˜ ์ฐจ์ด์  3. ์–ธ์–ด๋ชจ๋ธ๊ณผ์˜ ์ฐจ์ด์ 1. Word2Vec ๊ฐœ๋…Word2Vec๋Š” Word to Vector๋ผ๋Š” ์ด๋ฆ„์—์„œ ์•Œ ์ˆ˜ ์žˆ๋“ฏ์ด ๋‹จ์–ด(Word)๋ฅผ ์ปดํ“จํ„ฐ๊ฐ€ ์ดํ•ดํ•  ์ˆ˜ ์žˆ๋„๋ก ์ˆ˜์น˜ํ™”๋œ ๋ฒกํ„ฐ(Vector)๋กœ ํ‘œํ˜„ํ•˜๋Š” ๊ธฐ๋ฒ• ์ค‘ ํ•˜๋‚˜์ž…๋‹ˆ๋‹ค. ๊ตฌ์ฒด์ ์œผ๋กœ๋Š” ๋ถ„์‚ฐํ‘œํ˜„(Distributed Representation) ๊ธฐ๋ฐ˜์˜ ์›Œ๋“œ์ž„๋ฒ ๋”ฉ(Word Embedding) ๊ธฐ๋ฒ• ์ค‘ ํ•˜๋‚˜์ž…๋‹ˆ๋‹ค. ๋ถ„์‚ฐํ‘œํ˜„์ด๋ž€ ๋ถ„ํฌ๊ฐ€์„ค(Distibutional Hypothesis) ๊ฐ€์ • ํ•˜์— ์ €์ฐจ์›์— ๋‹จ์–ด ์˜๋ฏธ๋ฅผ ๋ถ„์‚ฐํ•˜์—ฌ ํ‘œํ˜„ํ•˜๋Š” ๊ธฐ๋ฒ•์ž…๋‹ˆ๋‹ค. ๋ถ„ํฌ๊ฐ€์„ค์€ "์œ ์‚ฌํ•œ ๋ฌธ๋งฅ์— ๋“ฑ์žฅํ•œ ๋‹จ์–ด๋Š” ์œ ์‚ฌํ•œ ์˜๋ฏธ๋ฅผ ๊ฐ–๋Š”๋‹ค"๋ผ๋Š” ๊ฐ€์ •์ž…๋‹ˆ๋‹ค. ์—ฌ๊ธฐ์„œ ๋‹จ์–ด๋ฅผ ๋ฒกํ„ฐํ™”ํ•˜๋Š” ์ž‘์—…์„ ์›Œ๋“œ์ž„๋ฒ ๋”ฉ(Word Embedding)์ด๋ผ๊ณ ..

[NLP] Word Embedding์˜ ์ดํ•ด: ํฌ์†Œํ‘œํ˜„๊ณผ ๋ฐ€์ง‘ํ‘œํ˜„

๐Ÿ“š ๋ชฉ์ฐจ1. ํฌ์†Œํ‘œํ˜„(Sparse Representation) 2. ๋ฐ€์ง‘ํ‘œํ˜„(Dense Representation) 3. ์›Œ๋“œ์ž„๋ฒ ๋”ฉ(Word Embedding)๋“ค์–ด๊ฐ€๋ฉฐ์›Œ๋“œ ์ž„๋ฒ ๋”ฉ(Word Embedding)์€ ๋‹จ์–ด(Word)๋ฅผ ์ปดํ“จํ„ฐ๊ฐ€ ์ดํ•ดํ•  ์ˆ˜ ์žˆ๋„๋ก ๋ฒกํ„ฐ๋กœ ํ‘œํ˜„ํ•˜๋Š” ๊ธฐ๋ฒ• ์ค‘ ํ•˜๋‚˜์ธ๋ฐ, ํŠนํžˆ ๋ฐ€์ง‘ํ‘œํ˜„(Dense Representation) ๋ฐฉ์‹์„ ํ†ตํ•ด ํ‘œํ˜„ํ•˜๋Š” ๊ธฐ๋ฒ•์„ ๋งํ•ฉ๋‹ˆ๋‹ค. ๋ฐ€์ง‘ํ‘œํ˜„๊ณผ ๋ฐ˜๋Œ€๋˜๋Š” ๊ฐœ๋…์ด ํฌ์†Œํ‘œํ˜„(Sparse Representation)์ž…๋‹ˆ๋‹ค. ์›Œ๋“œ ์ž„๋ฒ ๋”ฉ์„ ์ดํ•ดํ•˜๊ธฐ์— ์•ž์„œ ํฌ์†Œํ‘œํ˜„๊ณผ ๋ฐ€์ง‘ํ‘œํ˜„์— ๋Œ€ํ•ด ์•Œ์•„๋ด…๋‹ˆ๋‹ค.1. ํฌ์†Œํ‘œํ˜„(Sparse Representation)ํฌ์†Œํ‘œํ˜„์€ ๋ฐ์ดํ„ฐ๋ฅผ ๋ฒกํ„ฐ ๋˜๋Š” ํ–‰๋ ฌ์„ ๊ธฐ๋ฐ˜์œผ๋กœ ์ˆ˜์น˜ํ™”ํ•˜์—ฌ ํ‘œํ˜„ํ•  ๋•Œ ๊ทนํžˆ ์ผ๋ถ€์˜ ์ธ๋ฑ์Šค๋งŒ ํŠน์ • ๊ฐ’์œผ๋กœ ํ‘œํ˜„ํ•˜๊ณ , ๋Œ€๋ถ€๋ถ„์˜ ..