
Ironically, the air was good
The air was good ironically


A love that just can't be stopped
I know that there's already ain't no stopping

Hoping that you, too, feel the same flutter
And I hope you feel the same, babe (yeah)


I, standing at the starting point
No, we started


Course: Natural Language Processing (自然語言處理)
Course type: departmental elective
Instructor: 陳信希
College: College of Electrical Engineering and Computer Science (電資學院)
Department: Department of Computer Science and Information Engineering (資訊工程學系)
Exam date (Y/M/D): 2016/04/21
Time limit (minutes): 180 mins

Questions:

01. Machine translation (MT) is one of the practical NLP applications. The development of MT systems has a long history, but there is still room for improvement. Please address two linguistic phenomena to explain why MT systems are challenging. (10pts)

02. An NLP system can be implemented as a pipeline, including modules for morphological processing, syntactic analysis, semantic interpretation, and context analysis. Please use the following news story to describe the concepts behind these modules. You are asked to mention one task in each module. (10pts)

    這場地震可能影響日相安倍晉三的施政計畫。安倍十八日說，消費稅調漲的計畫不會改變。
    (The earthquake may affect Japanese Prime Minister Shinzo Abe's policy agenda. Abe said on the 18th that the plan to raise the consumption tax will not change.)

03. Ambiguity is inherent in natural language. Please describe why ambiguity may happen in each of the following cases. (10pts)
    (a) Prepositional phrase attachment.
    (b) Noun-noun compound.
    (c) Word: bass

04. Why is the extraction of multiword expressions critical for NLP applications? Please propose a method to check whether an extracted multiword expression meets the non-compositionality criterion. (10pts)

05. Mutual information and likelihood ratio are commonly used to find collocations in a corpus. Please describe the ideas behind these two methods. (10pts)

06. Emoticons are commonly used in social media. They can be regarded as a special vocabulary in a language. Emoticon understanding is helpful for understanding the utterances in an interaction. Please propose an "emoticon" embedding approach to represent each emoticon as a vector, and find the 5 most relevant words for each emoticon. (10pts)

07. To deal with unseen n-grams, smoothing techniques are adopted in the conventional language modeling approach. They are applied to n-grams to reallocate probability mass from observed n-grams to unobserved n-grams, producing better estimates for unseen data. Please show a smoothing technique for the conventional language model, and discuss why a neural network language model (NNLM) can achieve better generalization for unseen n-grams. (10pts)

08. In HMM learning, we aim at inferring the best model parameters, given a skeletal model and an observation sequence. The following two equations are related to computing the state transition probabilities:

    \hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \sum_{j=1}^{N} \xi_t(i,j)}

    \xi_t(i,j) = \frac{\alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}{\alpha_T(q_F)}

    Please answer the following questions. (10pts)
    (a) Intuitively, we could generate all possible paths for the given observation sequence and count the total number of times each transition is traversed. Which part of the above equations avoids the generation of all possible paths?
    (b) Which part of the above equations prorates the counts to estimate the probability of a transition?

09. Many NLP problems can be cast as a sequence labelling problem. Part-of-speech tagging is a typical example. Given a model and an observation sequence, we aim at finding the most probable state sequence. Please explain why this process is called a decoding process. In addition, please give another application that can also be treated as a sequence labelling problem. (10pts)

10. What are long-distance dependencies or unbounded dependencies? Why are such kinds of linguistic phenomena challenging in NLP? (10pts)

11. Part-of-speech tagging can be formulated in the following two alternative ways:

    Model 1: \hat{t}_1^n = \arg\max_{t_1^n} \prod_{i=1}^n P(w_i \mid t_i)\, P(t_i \mid t_{i-1})
    Model 2: \hat{t}_1^n = \arg\max_{t_1^n} \prod_{i=1}^n P(t_i \mid w_i, t_{i-1})

    Please answer the following questions. (10pts)
    (a) Which one is a discriminative model?
    (b) Which one can introduce more features?
    (c) Which one can use the Viterbi algorithm to improve the speed?
    (d) Which one is derived on the basis of Bayes' rule?

12. The following parse tree is selected from Chinese Treebank 8.0. What NP and VP rules can be extracted from this parse tree to form part of a treebank grammar? (10pts)

    ( (IP (IP (NP-SBJ (NN 建築))
              (VP (VC 是)
                  (NP-PRD (CP-APP (IP (NP-SBJ (-NONE- *pro*))
                                      (VP (VV 開發)
                                          (NP-PN-OBJ (NR 浦東))))
                                  (DEC 的))
                          (QP (CD 一)
                              (CLP (M 項)))
                          (ADJP (JJ 重要))
                          (NP (NN 經濟)
                              (NN 活動)))))
          (PU 。)
          (IP (NP-SBJ (-NONE- *pro*))
              (VP (DP-TMP (DT 這些)
                          (CLP (M 年)))
                  (VP (VE 有)
                      (IP-OBJ (NP-SBJ (NP (QP (CD 數百)
                                              (CLP (M 家)))
                                          (NP (NN 建築)
                                              (NN 公司)))
                                      (PU 、)
                                      (NP (QP (CD 四千餘)
                                              (CLP (M 個)))
                                          (NP (NN 建築)
                                              (NN 工地))))
                              (VP (VV 遍布)
                                  (PP-LOC (P 在)
                                          (LCP (NP (DP (DT 這)
                                                       (CLP (M 片)))
                                                   (NP (NN 熱土)))
                                               (LC 上))))))))
          (PU 。)) )
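
For question 8 above, the two equations are the standard Baum-Welch re-estimation of HMM transition probabilities from the forward variables α and backward variables β. The Python/NumPy sketch below simply mirrors those equations as a reading aid; it is not part of the exam, and the array layout, the variable names (A, B, obs, alpha, beta), and the use of the total observation likelihood in place of α_T(q_F) are my own assumptions.

```python
import numpy as np

def reestimate_transitions(A, B, obs, alpha, beta):
    """Re-estimate HMM transition probabilities a_ij, mirroring question 8.

    A      : (N, N) current transition matrix, A[i, j] = a_ij
    B      : (N, V) emission matrix, B[j, o] = b_j(o)
    obs    : length-T sequence of observation symbol indices
    alpha  : (T, N) forward variables,  alpha[t, i] = P(o_1..o_t, q_t = i)
    beta   : (T, N) backward variables, beta[t, j]  = P(o_{t+1}..o_T | q_t = j)
    """
    T, N = alpha.shape
    likelihood = alpha[-1].sum()   # stands in for alpha_T(q_F) = P(O | model)

    # xi[t, i, j] = alpha_t(i) * a_ij * b_j(o_{t+1}) * beta_{t+1}(j) / P(O | model)
    xi = np.zeros((T - 1, N, N))
    for t in range(T - 1):
        xi[t] = (alpha[t, :, None]            # alpha_t(i), as a column
                 * A                          # a_ij
                 * B[:, obs[t + 1]][None, :]  # b_j(o_{t+1}), as a row
                 * beta[t + 1][None, :]       # beta_{t+1}(j), as a row
                 ) / likelihood

    # a_hat[i, j] = sum_t xi[t, i, j] / ( sum_t sum_j xi[t, i, j] )
    expected_ij = xi.sum(axis=0)                         # expected count of i -> j
    expected_i = expected_ij.sum(axis=1, keepdims=True)  # expected transitions out of i
    return expected_ij / expected_i
```

Because α and β already sum over all state sequences reaching and leaving each state, computing ξ this way never enumerates individual paths, which is the intuition part (a) of that question is probing.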


Not sure whether asking this violates the board rules QQ, but I'd like some advice. The referral (招待) feature was changed a long while ago, and I left GE sitting unused for ages. A while back I reinstalled my computer and lost the program too. Recently I got the urge again to raise alt accounts for local co-op and luck stacking, so I picked the emulator back up. But this time, even though GE downloaded fine and Google is installed as well, Monster Strike (怪物彈珠) keeps failing to launch. May I ask: can GE no longer run Monster Strike?


Hello everyone: I'm looking for a skill-exchange partner!
【Me】 M.A. in foreign languages and literatures from 交大, now in the Ph.D. program in foreign languages and literatures at 台大. Whichever aspect of English you particularly want to work on is open for discussion (I will arrange the materials for you). I learned to swim as a child... so I have a bit of a foundation.
【You】 I hope you can help me recover my body's memory of swimming, and remind me to keep correct form.
Exchange city: Taichung
p.s. Being a 世界健身 (World Gym) member is a plus; in August we could go use the pool at the 學府 branch together! The city for the English exchange is likewise Taichung; the exact location can be discussed.
If you're interested, please send me a private message. Thanks!!!
--
Hermes Translation Studio 信使譯站
Studio staff: matrixasblues 鐵道迷 / Lan / Zoe 張柔伊
Website: http://hermestranstudio.weebly.com
Facebook: http://www.facebook.com/hermestranstudio



[Misc Question] How should foreigners' names be written in translation?


History

* He created mutated microorganisms that can break down the ash through their metabolism



  To emphasize for everyone: a router generally goes by many informal names, such as IP分享器 (IP sharer), 寬頻分享器 (broadband sharer), 寬頻路由器 (broadband router), and so on, but they all refer to the same kind of device.

  Later on we will cover DHCP, IP Address, MAC Address, basic network commands, and more, so stay tuned!
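
As a small preview of those upcoming topics, here is a minimal Python sketch (my own illustration, not from the original post; the helper names local_ip and local_mac are made up) that reads back the machine's IP Address and MAC Address using only the standard library.

```python
import socket
import uuid

def local_ip() -> str:
    """Return the IP address of the interface used for outbound traffic.

    connect() on a UDP socket sends nothing; it only selects the outgoing
    interface, whose address we then read back with getsockname().
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 53))   # any routable address works; no packet is sent
        return s.getsockname()[0]
    finally:
        s.close()

def local_mac() -> str:
    """Return one local interface's MAC address as a colon-separated string."""
    node = uuid.getnode()            # 48-bit hardware address as an integer
    return ":".join(f"{(node >> shift) & 0xff:02x}" for shift in range(40, -1, -8))

if __name__ == "__main__":
    print("IP Address :", local_ip())
    print("MAC Address:", local_mac())
```

On a typical home setup the printed IP will be a private address (for example, one in the 192.168.x.x range) that the router, acting as a DHCP server, handed out, which ties directly into the DHCP and IP Address topics mentioned above.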


Long golden hair falling down
