https://w.atwiki.jp/sitescript/pages/361.html
PORN.COM http://www.porn.com/ Free porn mega-site: "PORN.COM, world's greatest adult video porn tube! Super fast video streams and downloads of only the best hardcore sex videos and the newest adult DVDs." GAY.PORN.COM http://gay.porn.com/ "The GAY.PORN.COM Network is now the premier provider of Gay Sex Videos on the planet! All the top Gay Porn Stars gathered here for your viewing pleasure!"

Install the script: version 0.3, 2015.11.16

Revision history:
version 0.1 (2012.08.19): first release.
version 0.1.1 (2013.06.29): adapted to site-side specification changes.
version 0.2 (2014.08.18): adapted to the changed URL format; supports the mobile URL format; supports HD; dropped the old retrieval method.
version 0.3 (2015.11.16): adapted to changes in the page markup.

Comments:
"Error loading player: Could not load player configuration" appeared and I could not download. -- Anonymous (2014-08-18 05:31:31)
The site uses JW Player for video playback, and this error seems to occur when a previously used copy of JW Player remains in the browser cache. I think it can only be addressed on the JW Player side or the Craving Explorer side. Even when the error appears, the latest script should still be able to retrieve the video itself. -- Admin
This video can no longer be downloaded. -- Anonymous (2015-11-11 21:51:03)
[example URL 1129391] -- Anonymous (2015-11-11 21:51:10)
https://w.atwiki.jp/vocalive/pages/171.html
LIVE and Event List DB (DataBase for Vocaloid Live Concerts and Events)

Event Information (Live Concert and Event Information)

Current page: [LIVE/template テンプレ-Miku LIVE]
Enter the name of the page you created here, exactly as created, and make it an internal link to verify it.
Related page
■ Event name:
■ Venue:
■ Dates:
Twitter / Facebook

Summary Video, Live Streaming, Photos
Summary video of the live concert 2012 at Tokyo, Japan
[Hatsune Miku] "Miku's Day Thanksgiving Festival (ミクの日大感謝祭)" digest video: http://www.youtube.com/watch?v=rqg4Eun7fgs#t=325 (the festival begins 5 min 25 s, i.e. 325 s, from the start)
Related videos, Rehearsal: Miku 39's Giving Day EXTRA http://www.youtube.com/watch?v=IAyXqV0hlMw
HatsuneMiku channel: http://www.youtube.com/user/HatsuneMiku
[Hatsune Miku] CONCERT!! playlist: http://www.youtube.com/playlist?list=PL-pKPpZ1Q5NZqi-pHqc2zKXk8GG3U2fkW

Ticket and Goods Information, BD/CD, etc.
Ticket information list
Official goods: http://miku.sega.jp/39/goods_list.html
Nico Nico Live streaming

Technology, Performing Vocaloids, Screen, MMD, 3D Models, Projectors, Software, Stage
Live technology list
Performing Vocaloids: Hatsune Miku, Kagamine Rin, Kagamine Len, Megurine Luka, Meiko, Kaito
If the technical details are unknown, leave the section open. (IF YOU DO NOT UNDERSTAND, LEAVE THE TECHNOLOGY SECTION OPEN.)
Screen
Screen type: DILAD (DILAD board or DILAD screen), or an improved version
Shape of screen:
Materials for making the screen:
Mesh number / mesh size of the screen material:
Product (catalog) number of the screen material:
Light transmittance (%) and color of the screen material:
Reflection on the projection screen:
Clearness / resolution of the screen:
Stage height:
Screen height:
Degree of screen curvature (= degree of distortion of reflected objects): very small
Screen size / width:
Number of junctions within the screen (= number of projection boards joined minus one?): 0
Number of materials used for making the main screen:
Number of projection boards used:
Sound equipment and sound quality:
Projector type and count:
Software: VF engine (Virtual Fighter engine, VF5); wikipedia: Virtual Fighter engine; AM2 of SEGA
3D model: AC model ("Ake-Miku", the "Thanksgiving Festival model"), SEGA's arcade model
Producer of 3D model: AM2 of SEGA http://ja.wikipedia.org/wiki/SEGA-AM2
Resolution of screen and video:
Color adjustment for on-screen viewing:
Model type of MikuMikuDance:
Computer and OS:
Real-time rendering or not:
Rendering speed: 60 fps; frame rate: 60 fps
Camera:
Technical videos and photos:
Homepage of the event hall: Tokyo Dome City official site, MEETS PORT http://www.meetsport.jp/about/seat/1f_05.htm#zaseki
Others:
Clothing modules and design of the models: http://miku.sega.jp/pjd2/module.html
Synchronization system: CreativES Inc. http://creatives.jp/ devised the synchronized-projection live system and handled its planning and production. Now widewireworks: http://widewireworks
Thanksgiving Festival works page: http://widewireworks.jp/works.html#miku01
Producer 黒田貴泰 ("Kuroda-P", CreativES Inc.): at the "Miku's Day Thanksgiving Festival" he "produced a virtual-artist concert live unlike anything else in the world; he devised the synchronized-projection live system and handled planning and production." http://vocaloid.blog120.fc2.com/blog-entry-13651.html
The band members reportedly wear earphones and can hardly hear the cheering; during the live they use earphones with the stage monitors to check their own sound.
Song selection: Naka-no-hito No. 1 (中の人1号), and others

Musicians, Staff, Related Blogs and Credits

Performers and related blogs
Band: The 39s: 安部潤 (key, P), 黒田晃年 (g), 田中晋吾 (b), 折田新 (ds)
Strings: 武内香澄 (vln), 河本夕里安 (vln), 惠藤あゆ (vla), さいとうひさこ (2nd vln), 今井香織 (vcl, cello), 上保朋子 (2nd vln), 山口佳名子 (2nd vln), 栗井まどか (2nd vln), 山崎明子 (vcl)
Horns: 宮崎隆睦 (sax, fl), 田中充 (tp), 池田雅明 (tb)
[Hatsune Miku] "AcoMiku" finished-recording digest [The 39s], on sale from December 2011: http://www.nicovideo.jp/watch/sm16040410
Links to each member can be followed from 安部潤's blog:
http://blog.goo.ne.jp/jabe0755/e/c745caedc63022151c9f23c8b6be5836
http://blog.goo.ne.jp/jabe0755/e/fba261ecf49ad4f61c3a4d7e6ef686fb
http://blog.goo.ne.jp/jabe0755/e/c35d565f1c0e3ed27b78509f0dac788c
http://blog.goo.ne.jp/jabe0755/s/%A5%DF%A5%AF
折田新 (ds): http://blog.goo.ne.jp/shinxdrum/e/8930b03acd91db0e1daa0ad1eda89457?fm=entry_awc
黒田晃年 (g): http://akitoshikuroda.me/
田中晋吾 (b): http://homepage2.nifty.com/shingobass/index.html http://shingo.tea-nifty.com/blog/2011/07/la-76c5.html
武内香澄 (vln): http://ameblo.jp/pyonnchan/entry-11035588430.html http://ameblo.jp/pyonnchan/entry-10944809273.html
河本夕里安 (vln): http://ameblo.jp/yuristrawberry/entry-10945217876.html
The guest producers appearing at each show have been announced!

Set List (Songs Performed)
Set lists and producer links on the Vocaloid wiki:
Vocaloid concert directory: http://vocaloid.wikia.com/wiki/Vocaloid_concert_directory
Live Events: http://vocaloid.wikia.com/wiki/Live_Events
http://hatsunemikusetlists.wordpress.com/ (for listings that are already in English, there is no romaji)

Summary in English and other languages
http://www.niconico.com/
http://live.niconico.com/

Organizer and Group / Sponsor and Support / Related Event Info / News, References, Acknowledgements and Credits / International News / New Project and collaboration Blog / Memo
http://www18.atwiki.jp/vocalive/editx/27.html
If you do not know how to edit pages on this wiki, do not edit.
To edit the current page, replace the page number after /vocalive/editx/ with the correct one (/vocalive/editx/PAGE_NUMBER.html).
The [ページ保存] (Save page) button below the editing window saves the page and finishes editing.
The [プレビュー] (Preview) button below the editing window previews the page while editing.
Enter the code number shown to perform these commands.
To cancel editing, use the browser's Back button outside the editing window, or close the browser window showing the editing page. If you make a mistake, do not save the page; do not press [ページ保存].
The [» タグ] (tag) box below the editing window adds tags to the page after editing. If you run into a problem, add the tag "HELP" so the page can be identified later for repair.
https://w.atwiki.jp/memotech/pages/16.html
JDK 5.0 Update 16

Download the .bin installer from http://java.sun.com, place it where you want it installed, and run it to set up. The sample setup below installs into /usr/local/java.

[bose999@bose999-rhel5 local]$ su -
[root@bose999-rhel5 local]# cd <directory containing the downloaded jdk-1_5_0_16-linux-i586.bin>
[root@bose999-rhel5 local]# mkdir /usr/local/java
[root@bose999-rhel5 local]# cp ./jdk-1_5_0_16-linux-i586.bin /usr/local/java
[root@bose999-rhel5 local]# chown root:root /usr/local/java/jdk-1_5_0_16-linux-i586.bin
[root@bose999-rhel5 local]# chmod 774 /usr/local/java/jdk-1_5_0_16-linux-i586.bin
[root@bose999-rhel5 local]# /usr/local/java/jdk-1_5_0_16-linux-i586.bin

The setup program starts in CUI mode; read the license and type "yes".

[root@bose999-rhel5 local]# rm -f /usr/local/java/jdk-1_5_0_16-linux-i586.bin

Set JAVA_HOME and PATH in the configuration file of the shell you use (.bashrc, etc.).

Example .bashrc settings:
export JAVA_HOME=/usr/local/java/jdk1.5.0_16
export PATH=$JAVA_HOME/bin:$PATH

Last updated: 2009-02-24 17:17:12 (Tue)
@めもてっく is licensed under a Creative Commons Attribution 2.1 Japan License.
https://w.atwiki.jp/usb_audio/pages/33.html
Source: Audio Device Document 1.0 (PDF). USB Device Class Definition for Audio Devices, Release 1.0, March 18, 1998.

· Reverb Level sets the amount of reverberant sound.
· Reverb Time sets the time over which the reverberation will continue.
· Reverb Delay Feedback is used with the Reverb Types Delay and Delay Panning. It sets the way in which the delay repeats.

The effects of the Reverberation Processing Unit can be bypassed at all times through manipulation of the Enable Processing Control. In principle, the algorithm to produce the desired reverberation effect influences all channels as a whole. It is entirely left to the designer how a certain reverberation effect is obtained. It is not the intention of this specification to precisely define all the parameters that influence the reverberation experience (for instance, in a multi-channel system it is possible to create very similar reverberation impressions using different algorithms and parameter settings on all channels). The symbol for the Reverberation Processing Unit can be found in the following figure:

[image: Figure 3-9 Reverberation Processing Unit Icon]

3.5.6.5 Chorus Processing Unit

The Chorus Processing Unit is used to add chorus effects to the original audio information. A number of parameters can be manipulated to obtain the desired chorus effects.

· Chorus Level controls the amount of the chorus effect sound.
· Chorus Modulation Rate sets the speed (frequency) of the chorus modulator.
· Chorus Modulation Depth sets the depth at which the chorus sound is modulated.

The effects of the Chorus Processing Unit can be bypassed at all times through manipulation of the Enable Processing Control. In principle, the algorithm to produce the desired chorus effect influences all channels as a whole. It is entirely left to the designer how a certain chorus effect is obtained. It is not the intention of this specification to precisely define all the parameters that influence the chorus experience.
The symbol for the Chorus Processing Unit can be found in the following figure:

[image: Figure 3-10 Chorus Processing Unit Icon]

3.5.6.6 Dynamic Range Compressor Processing Unit

The Dynamic Range Compressor Processing Unit is used to intelligently limit the dynamic range of the original audio information. A number of parameters can be manipulated to influence the desired compression.

[image: Figure 3-11 Dynamic Range Compressor Transfer Characteristic]

· Compression ratio R determines the slope of the static input-to-output transfer characteristic in the compressor's active input range. The compression is defined in terms of the compression ratio R, which is the inverse of the derivative of the output power PO as a function of the input power PI, when PO and PI are expressed in dB:

R = (dPO / dPI)^(-1)

PR is the reference level and it is made equal to the so-called line level. All levels are expressed relative to the line level (0 dB), which is usually 15-20 dB below the maximum level. Compression is obtained when R > 1; R = 1 does not affect the signal, and R < 1 gives rise to expansion.
· Maximum Amplitude: the upper boundary of the active input range, relative to the line level (0 dB). Expressed in dB.
· Threshold level: the lower boundary of the active input range, relative to the line level (0 dB).
· Attack Time: determines the response of the compressor as a function of time to a step in the input level. Expressed in ms.
· Release Time: relates to the recovery time of the gain of the compressor after a loud passage. Expressed in ms.

The effects of the Dynamic Range Compressor Processing Unit can be bypassed at all times through manipulation of the Enable Processing Control. In principle, the algorithm to produce the desired dynamic range compression influences all channels as a whole. It is entirely left to the designer how a certain dynamic range compression is obtained.
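As a concrete illustration of the static transfer characteristic described above, here is a small sketch. It is not from the specification (which deliberately leaves the exact algorithm to the designer): it passes signals below the Threshold unchanged and applies a slope of 1/R between the Threshold and the Maximum Amplitude, with all levels in dB relative to the line level (0 dB).

```python
def compressor_output_db(p_in, ratio, threshold, max_amplitude):
    """Static input-to-output characteristic of a dynamic range compressor.

    All levels are in dB relative to the line level (0 dB). Below the
    threshold the gain is unity; within the active input range
    [threshold, max_amplitude] the output rises with slope 1/ratio.
    Illustrative only: the spec does not prescribe this exact mapping.
    """
    if p_in <= threshold:
        return p_in                      # below the active range: unchanged
    p_in = min(p_in, max_amplitude)      # clamp to the active input range
    return threshold + (p_in - threshold) / ratio
```

With ratio = 1 this characteristic is the identity, matching the statement that R = 1 does not affect the signal, while ratio < 1 expands rather than compresses.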
The symbol for the Dynamic Range Compressor Processing Unit can be found in the following figure:

[image: Figure 3-12 Dynamic Range Compressor Processing Unit Icon]

3.5.7 Extension Unit

The Extension Unit (XU) is the method provided by this specification to easily add vendor-specific building blocks to the specification. The Extension Unit provides one or more logical input channels, grouped into one or more audio channel clusters, and transforms them into a number of logical output channels, grouped into one audio channel cluster. Therefore, the Extension Unit can have multiple Input Pins and has a single Output Pin. Extension Units are required to support at least the Enable Processing Control, allowing the Host software to bypass whatever functionality is incorporated in the Extension Unit. Although a generic audio driver will not be able to determine what functionality is implemented in the Extension Unit, let alone manipulate it, it still will be capable of recognizing the presence of vendor-specific extensions and assume default behavior for those units. The symbol for the Extension Unit can be found in the following figure:

[image: Figure 3-13 Extension Unit Icon]

3.5.8 Associated Interfaces

In some cases, an audio function building block (Terminal, Mixer Unit, Feature Unit, and so on) needs to be associated with interfaces that are not part of the Audio Interface Collection. As an example, consider a speaker system with a front-panel volume knob. The manufacturer might want to impose a binding between the front-panel volume Control and the speaker system's volume setting. The volume knob could be represented by a HID interface that coexists with the Audio Interface Collection.
To create a binding between the Feature Unit inside the audio function that deals with master Volume Control and the front-panel volume knob, the Feature Unit descriptor can be supplemented by a special Associated Interface descriptor that holds a link to the associated HID interface. In general, each Terminal or Unit descriptor can be supplemented by one or more optional Associated Interface descriptors that hold a reference to an interface. This interface is external to the audio function and interacts in a certain way with the Terminal or Unit. The layout of the Associated Interface descriptor is open-ended and is qualified by the Entity type it succeeds and by the target interface Class type it references. For the time being, this specification does not define any specific Associated Interface descriptor layout.

3.6 Copy Protection

Because the Audio Device Class is primarily dealing with digital audio streams, the issue of protecting these (often copyrighted) streams cannot be ignored. Therefore, this specification provides the means to preserve whatever copyright information is available. However, it is the responsibility of the Host software to manage the flow of copy protection information throughout the audio function. Copy protection issues come into play whenever digital audio streams enter or leave the audio function. Therefore, the copy protection mechanism is implemented at the Terminal level in the audio function. Streams entering the audio function can be accompanied by specific information describing the copy protection level of that audio stream. Likewise, streams leaving the audio function should be accompanied by the appropriate copy protection information, if the hardware permits it. This specification provides for two dedicated requests that can be used to manage the copy protection mechanism.
The Get Copy Protect request can be used to retrieve copy protection information from an Input Terminal, whereas the Set Copy Protect request is used to preset the copy protection level of an Output Terminal. This specification provides for three levels of copy permission, similar to CGMS (Copy Generation Management System) and SCMS (Serial Copy Management System).

· Level 0: Copying is permitted without restriction. The material is either not copyrighted, or the copyright is not asserted.
· Level 1: One generation of copies may be made. The material is copyright protected and is the original.
· Level 2: The material is copyright protected and no digital copying is permitted.

3.7 Operational Model

A device can support multiple configurations. Within each configuration there can be multiple interfaces, each possibly having alternate settings. These interfaces can pertain to different functions that co-reside in the same composite device. Even several independent audio functions can exist in the same device. Interfaces belonging to the same audio function are grouped into an Audio Interface Collection. If the device contains multiple independent audio functions, there must be multiple Audio Interface Collections, each providing full access to its associated audio function. As an example of a composite device, consider a PC monitor equipped with a built-in stereo speaker system. Such a device could be configured to have one interface dealing with configuration and control of the monitor part of the device (HID Class), while a Collection of two other interfaces deals with its audio aspects. One of those, the AudioControl interface, is used to control the inner workings of the function (Volume Control, etc.) whereas the other, the AudioStreaming interface, handles the data traffic sent to the monitor's audio subsystem.
The AudioStreaming interface could be configured to operate in mono mode (alternate setting x), in which only a single-channel data stream is sent to the audio function. The receiving Input Terminal could duplicate this audio stream into two logical channels, and those could then be reproduced on both speakers. From an interface point of view, such a setup requires one isochronous endpoint in the AudioStreaming interface to receive the mono audio data stream, in addition to the mandatory control endpoint and optional interrupt endpoint in the AudioControl interface. The same system could be used to play back stereo audio. In this case, the stereo AudioStreaming interface must be selected (alternate setting y). This interface also consists of a single isochronous endpoint, now receiving a data stream that interleaves left and right channel samples. The receiving Input Terminal now splits the stream into a Left and a Right logical channel. The AudioControl interface remains unchanged. If the above AudioStreaming interface were an asynchronous sink, one extra isochronous synch endpoint would also be necessary.

Audio Interface Collections can be dynamic. Because the AudioControl interface, together with its associated AudioStreaming interface(s), constitutes the 'logical interface' to the audio function, they must all come into existence at the same moment in time. As stated earlier, audio functionality is located at the interface level in the device class hierarchy. The following sections describe the Audio Interface Collection, containing a single AudioControl interface and optional AudioStreaming interfaces, together with their associated endpoints that are used for audio function control and for audio data stream transfer.

3.7.1 AudioControl Interface

To control the functional behavior of a particular audio function, the Host can manipulate the Units and Terminals inside the audio function.
To make these objects accessible, the audio function must expose a single AudioControl interface. This interface can contain the following endpoints:

· A control endpoint for manipulating Unit and Terminal settings and retrieving the state of the audio function. This endpoint is mandatory, and the default endpoint 0 is used for this purpose.
· An interrupt endpoint for status returns. This endpoint is optional.

The AudioControl interface is the single entry point to access the internals of the audio function. All requests that are concerned with the manipulation of certain audio Controls within the audio function's Units or Terminals must be directed to the AudioControl interface of the audio function. Likewise, all descriptors related to the internals of the audio function are part of the class-specific AudioControl interface descriptor. The AudioControl interface of an audio function may support multiple alternate settings. Alternate settings of the AudioControl interface could, for instance, be used to implement audio functions that support multiple topologies by presenting different class-specific AudioControl interface descriptors for each alternate setting.

3.7.1.1 Control Endpoint

The audio interface class uses endpoint 0 (the default pipe) as the standard way to control the audio function using class-specific requests. These requests are always directed to one of the Units or Terminals that make up the audio function. The format and contents of these requests are detailed further in this document.

3.7.1.2 Status Interrupt Endpoint

A USB AudioControl interface can support an optional interrupt endpoint to inform the Host about the status of the different addressable Entities (Terminals, Units, interfaces, and endpoints) inside the audio function. In fact, the interrupt endpoint is used by the entire Audio Interface Collection to convey status information to the Host.
It is considered part of the AudioControl interface because this is the anchor interface for the Collection. The interrupt data is a 2-byte entity. The bStatusType field contains information in D7 indicating whether there is still an interrupt pending or not. This bit remains set until all pending interrupts are properly serviced. The other bits are used to report the cause of the interrupt in more detail. Bit D6 of the bStatusType field indicates a change in memory contents on one of the addressable Entities inside the audio function. This bit is cleared by a Get Memory request on the appropriate Entity. Bits D3..0 indicate the originator of the current interrupt. All addressable Entities inside an audio function can be the originator. The contents of the bOriginator field must be interpreted according to the code in D3..0 of the bStatusType field. If the originator is the AudioControl interface, the bOriginator field contains the Terminal ID or Unit ID of the Entity that caused the interrupt to occur. If the bOriginator field is set to zero, the 'virtual' Entity interface is the originator. This can be used to report global AudioControl interface changes to the Host. If the originator is an AudioStreaming interface, the bOriginator field contains the interface number of the AudioStreaming interface. Likewise, it contains the endpoint number if the originator were an AudioStreaming endpoint. The proper response to an interrupt is either a Get Status request (D6=0) or a Get Memory request (D6=1). Issuing these requests to the appropriate originator must clear the Interrupt Pending bit and the Memory Contents Changed bit, if applicable. The following table specifies the format of the status word:

Offset | Field | Size | Description
0 | bStatusType | 1 | D7: Interrupt Pending; D6: Memory Contents Changed; D3..0: code identifying the originator
1 | bOriginator | 1 | ID of the originating Entity (Terminal/Unit ID, interface number, or endpoint number, depending on D3..0)
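The 2-byte status word described above can be decoded with a few bit masks. The sketch below is illustrative (the function and key names are my own); it assumes only the bit layout given in the text: D7 = Interrupt Pending, D6 = Memory Contents Changed, D3..0 = originator code, followed by the bOriginator byte.

```python
def parse_audio_status_word(data):
    """Decode the 2-byte status interrupt word described above.

    `data` is the raw 2-byte payload: bStatusType then bOriginator.
    Field names follow the text; how originator_id is interpreted
    (Terminal/Unit ID, interface number, or endpoint number) depends
    on the originator code in D3..0.
    """
    if len(data) != 2:
        raise ValueError("status word is exactly 2 bytes")
    b_status_type, b_originator = data[0], data[1]
    return {
        "interrupt_pending": bool(b_status_type & 0x80),  # bit D7
        "memory_changed": bool(b_status_type & 0x40),     # bit D6
        "originator_code": b_status_type & 0x0F,          # bits D3..0
        "originator_id": b_originator,
    }
```

For example, a payload of 0xC1 0x05 reports a pending interrupt with changed memory contents, originator code 1, originator ID 5.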
https://w.atwiki.jp/warband/pages/658.html
fac_no_faction|未所属
fac_commoners|平民
fac_outlaws|無法者
fac_neutral|中立
fac_innocents|無辜の民
fac_merchants|商人
fac_culture_1|ヴェイルの
fac_culture_2|ドラゴンストーンの
fac_culture_3|ノーヴォスの
fac_culture_4|北部の
fac_culture_5|リヴァーランドの
fac_culture_6|リーチの
fac_culture_7|ブラーヴォスの
fac_culture_8|鉄諸島の
fac_culture_9|ドーンの
fac_culture_10|ナイツ・ウォッチの
fac_culture_11|自由の民の
fac_culture_12|ドスラク族の
fac_culture_13|ペントスの
fac_culture_14|ターガリエンの
fac_culture_14_1|ターガリエンの
fac_culture_15|ミアの
fac_culture_16|タイロシュの
fac_culture_17|ロラスの
fac_culture_18|ウェスターランドの
fac_culture_19|ライスの
fac_culture_20|クォホールの
fac_culture_21|ヴォランティスの
fac_culture_22|ストームランドの
fac_wild_animals|野生動物
fac_player_faction|あなたの勢力
fac_player_supporters_faction|あなたの支持者
fac_kingdom_1|ヴェイル
fac_kingdom_2|ドラゴンストーン
fac_kingdom_4|北部
fac_kingdom_5|リヴァーランド
fac_kingdom_6|リーチ
fac_kingdom_8|鉄諸島
fac_kingdom_9|ドーン
fac_kingdom_18|ウェスターランド
fac_kingdom_22|ストームランド
fac_kingdom_14|ターガリエン家
fac_kingdom_14_1|ターガリエン家
fac_kingdom_11|自由の民
fac_kingdom_10|ナイツウォッチ
fac_kingdom_12|ドスラクカラザール
fac_kingdom_13|ペントス
fac_kingdom_3|ノーヴォス
fac_kingdom_15|ミア
fac_kingdom_16|タイロシュ
fac_kingdom_17|ロラス
fac_kingdom_19|ライス
fac_kingdom_20|クォホール
fac_kingdom_21|ヴォランティス
fac_kingdom_7|ブラーヴォス
fac_astapor|アスタポア
fac_hill_tribes|渓谷民族
fac_windblown|ウィンドブロウン
fac_golden_company|ゴールデンカンパニー
fac_second_sons|セカンドサンズ傭兵団
fac_stormcrows|ストームクロウズ傭兵団
fac_gallant_men|ギャラント・メン
fac_iron_legionaries|アイアン軍団
fac_old_gods_of_the_forest|森の古の神々
fac_rhllor|ル=ロール
fac_the_faith_of_the_seven|七神正教
fac_drowned_gods|溺神
fac_manhunters|兄弟団
fac_deserters|脱走兵
fac_ironborn_pirates|鉄人海賊
fac_forest_bandits|森賊
fac_dothraki_raiders|ドスラク族の略奪者
fac_northern_clansmen|北部山岳民族
fac_whitewalkers|ホワイトウォーカー
fac_apoyoplayer|中立
fac_ccoop_all_stars|All Stars
https://w.atwiki.jp/usb_audio/pages/52.html
原文:Audio Device Document 1.0(PDF) USB Device Class Definition for Audio Devices Release 1.0 March 18, 1998 116 Offset Field Size Value Description 5 bDeviceSubClass 1 0x00 Unused. 6 bDeviceProtocol 1 0x00 Unused. 7 bMaxPacketSize0 1 0x08 8 bytes. 8 idVendor 2 0xXXXX Vendor ID. 10 idProduct 2 0xXXXX Product ID. 12 bcdDevice 2 0xXXXX Device Release Code. 14 iManufacturer 1 0x01 Index to string descriptor that contains the string Your Name in Unicode. 15 iProduct 1 0x02 Index to string descriptor that contains the string Your Product Name in Unicode. 16 iSerialNumber 1 0x00 Unused. 17 bNumConfiguration s 1 0x01 One configuration. C.3.2 Configuration Descriptor Table C-2 USB Telephone Configuration Descriptor Offset Field Size Value Description 0 bLength 1 0x09 Size of this descriptor, in bytes. 1 bDescriptorType 1 0x02 CONFIGURATION descriptor. 2 wTotalLength 2 0x00XX Length of the total configuration block, including this descriptor, in bytes. 4 bNumInterfaces 1 0x03 Three interfaces 5 bConfigurationValue 1 0x01 ID of this configuration 6 iConfiguration 1 0x00 Unused. 7 bmAttributes 1 0x60 Self Powered Remote Wakeup capable. 8 MaxPower 1 0x00 Not applicable. C.3.3 AudioControl Interface Descriptor The AudioControl interface describes the device structure and is used to manipulate the Audio Controls. USB Device Class Definition for Audio Devices Release 1.0 March 18, 1998 117 C.3.3.1 Standard AC Interface Descriptor The AudioControl interface has no dedicated endpoints associated with it. It uses the default pipe (endpoint 0) for all communication purposes. Class-specific AudioControl Requests are sent using the default pipe. There is no Status Interrupt endpoint provided. Table C-3 USB Telephone Standard AC Interface Descriptor Offset Field Size Value Description 0 bLength 1 0x09 Size of this descriptor, in bytes. 1 bDescriptorType 1 0x04 INTERFACE descriptor. 2 bInterfaceNumber 1 0x00 Index of this interface. 3 bAlternateSetting 1 0x00 Index of this setting. 
4 bNumEndpoints 1 0x00 0 endpoints. 5 bInterfaceClass 1 0x01 AUDIO. 6 bInterfaceSubclass 1 0x01 AUDIO_CONTROL. 7 bInterfaceProtocol 1 0x00 Unused. 8 iInterface 1 0x00 Unused. C.3.3.2 Class-specific Interface Descriptor The Class-specific AC interface descriptor is always headed by a Header descriptor that contains general information about the AudioControl interface. It contains all the pointers needed to describe the Audio Interface Collection, associated with the described audio function. Table C-4 USB Telephone Class-specific Interface Descriptor Offset Field Size Value Description 0 bLength 1 0x0A Size of this descriptor, in bytes. 1 bDescriptorType 1 0x24 CS_INTERFACE. 2 bDescriptorSubtype 1 0x01 HEADER subtype. 3 bcdADC 2 0x0100 Revision of class specification - 1.0 5 wTotalLength 2 0x0064 Total size of class specific descriptors. 7 bInCollection 1 0x02 Number of streaming interfaces 8 baInterfaceNr(1) 1 0x01 AudioStreaming interface 1 belongs to this AudioControl interface. 9 BaInterfaceNr(2) 1 0x02 AudioStreaming interface 2 belongs to this AudioControl interface. USB Device Class Definition for Audio Devices Release 1.0 March 18, 1998 118 C.3.3.3 Input Terminal Descriptor (ID1) This descriptor describes the Input Terminal that represents the analog telephone line input. The audio channel cluster on the single Output Pin contains a single logical channel (bNrChannels=1) and there is no spatial location associated with this mono channel (wChannelConfig=0x0000). This is the input part of a bi-directional Terminal and therefore has an associated Output Terminal (ID4). Table C-5 USB Telephone Input Terminal Descriptor (ID1) Offset Field Size Value Description 0 bLength 1 0x0C Size of this descriptor, in bytes. 1 bDescriptorType 1 0x24 CS_INTERFACE. 2 bDescriptorSubtype 1 0x02 INPUT_TERMINAL subtype. 3 bTerminalID 1 0x01 ID of this Terminal. 4 wTerminalType 2 0x0501 Terminal is Phone Line In. 6 bAssocTerminal 1 0x04 Associated with Phone Line Out Terminal. 
Offset  Field               Size  Value   Description
7       bNrChannels         1     0x01    One channel.
8       wChannelConfig      2     0x0000  Mono sets no position bits.
10      iChannelNames       1     0x00    Unused.
11      iTerminal           1     0x00    Unused.

C.3.3.4 Input Terminal Descriptor (ID2)

This descriptor describes the telephone handset input microphone. The audio channel cluster on the single Output Pin contains a single logical channel (bNrChannels=1), and there is no spatial location associated with this mono channel (wChannelConfig=0x0000). This is the input part of a bi-directional Terminal and therefore has an associated Output Terminal (ID5).

Table C-6: USB Telephone Input Terminal Descriptor (ID2)

Offset  Field               Size  Value   Description
0       bLength             1     0x0C    Size of this descriptor, in bytes.
1       bDescriptorType     1     0x24    CS_INTERFACE.
2       bDescriptorSubtype  1     0x02    INPUT_TERMINAL subtype.
3       bTerminalID         1     0x02    ID of this Terminal.
4       wTerminalType       2     0x0401  Terminal is Handset In.
6       bAssocTerminal      1     0x05    Associated with Handset Out Terminal.
7       bNrChannels         1     0x01    One channel.
8       wChannelConfig      2     0x0000  Mono sets no position bits.
10      iChannelNames       1     0x00    Unused.
11      iTerminal           1     0x04    Unused.

(USB Device Class Definition for Audio Devices, Release 1.0, March 18, 1998)

C.3.3.5 Input Terminal Descriptor (ID3)

This descriptor describes the USB stream from the Host to the telephone set. The audio channel cluster on the single Output Pin contains a single logical channel (bNrChannels=1), and there is no spatial location associated with this mono channel (wChannelConfig=0x0000). This is the input part of a bi-directional Terminal and therefore has an associated Output Terminal (ID6).

Table C-7: USB Telephone Input Terminal Descriptor (ID3)

Offset  Field               Size  Value   Description
0       bLength             1     0x0C    Size of this descriptor, in bytes.
1       bDescriptorType     1     0x24    CS_INTERFACE.
2       bDescriptorSubtype  1     0x02    INPUT_TERMINAL subtype.
3       bTerminalID         1     0x03    ID of this Terminal.
4       wTerminalType       2     0x0101  Terminal is USB Streaming In.
6       bAssocTerminal      1     0x06    Associated with USB Streaming Out Terminal.
7       bNrChannels         1     0x01    One channel.
8       wChannelConfig      2     0x0000  Mono sets no position bits.
10      iChannelNames       1     0x00    Unused.
11      iTerminal           1     0x05    Unused.

C.3.3.6 Output Terminal Descriptor (ID4)

This descriptor describes the Output Terminal that represents the analog telephone line output. The audio channel cluster on the single Input Pin contains a single logical channel. This is the output part of a bi-directional Terminal and therefore has an associated Input Terminal (ID1).

Table C-8: USB Telephone Output Terminal Descriptor (ID4)

Offset  Field               Size  Value   Description
0       bLength             1     0x09    Size of this descriptor, in bytes.
1       bDescriptorType     1     0x24    CS_INTERFACE.
2       bDescriptorSubtype  1     0x03    OUTPUT_TERMINAL subtype.
3       bTerminalID         1     0x04    ID of this Terminal.
4       wTerminalType       2     0x0501  Terminal is Phone Line Out.
6       bAssocTerminal      1     0x01    Associated with Phone Line In Terminal.
7       bSourceID           1     0x07    From Phone Line Selector Unit.
8       iTerminal           1     0x06    Unused.

C.3.3.7 Output Terminal Descriptor (ID5)

This descriptor describes the telephone handset output earpiece. The audio channel cluster on the single Input Pin contains a single logical channel. This is the output part of a bi-directional Terminal and therefore has an associated Input Terminal (ID2).

Table C-9: USB Telephone Output Terminal Descriptor (ID5)

Offset  Field               Size  Value   Description
0       bLength             1     0x09    Size of this descriptor, in bytes.
1       bDescriptorType     1     0x24    CS_INTERFACE.
2       bDescriptorSubtype  1     0x03    OUTPUT_TERMINAL subtype.
3       bTerminalID         1     0x05    ID of this Terminal.
4       wTerminalType       2     0x0401  Terminal is Handset Out.
6       bAssocTerminal      1     0x02    Associated with Handset In Terminal.
7       bSourceID           1     0x08    From Handset Selector Unit.
8       iTerminal           1     0x00    Unused.

C.3.3.8 Output Terminal Descriptor (ID6)

This descriptor describes the USB stream from the telephone set to the Host. The audio channel cluster on the single Input Pin contains a single logical channel. This is the output part of a bi-directional Terminal and therefore has an associated Input Terminal (ID3).

Table C-10: USB Telephone Output Terminal Descriptor (ID6)

Offset  Field               Size  Value   Description
0       bLength             1     0x09    Size of this descriptor, in bytes.
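Fixed layouts like the tables above lend themselves to direct byte packing. The following is a minimal sketch, not part of the specification, that packs the 12-byte class-specific Input Terminal descriptor of Table C-6 using Python's struct module; multi-byte fields are little-endian, as everywhere in USB.

```python
# Sketch: packing the 12-byte class-specific Input Terminal descriptor
# (layout of Table C-6). Multi-byte fields are little-endian.
import struct

def input_terminal_descriptor(terminal_id, terminal_type, assoc_terminal,
                              nr_channels, channel_config):
    return struct.pack(
        "<BBBBHBBHBB",
        0x0C,            # bLength: 12 bytes
        0x24,            # bDescriptorType: CS_INTERFACE
        0x02,            # bDescriptorSubtype: INPUT_TERMINAL
        terminal_id,     # bTerminalID
        terminal_type,   # wTerminalType (e.g. 0x0401 = Handset In)
        assoc_terminal,  # bAssocTerminal
        nr_channels,     # bNrChannels
        channel_config,  # wChannelConfig (0x0000 = mono, no position bits)
        0x00,            # iChannelNames: unused
        0x00,            # iTerminal: unused
    )

# The handset-microphone Terminal (ID2) from Table C-6:
desc = input_terminal_descriptor(0x02, 0x0401, 0x05, 0x01, 0x0000)
assert len(desc) == 0x0C
```

The format string mirrors the offset column: single-byte fields are `B`, and the two-byte wTerminalType and wChannelConfig fields are `H`, which is why offsets jump from 4 to 6 and from 8 to 10.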
https://w.atwiki.jp/api_programming/pages/192.html
Subpages: Client-side applications / Installed applications

Contents
- Basic steps
  - Obtain OAuth 2.0 credentials from the Google API Console
  - Obtain an access token from the Google Authorization Server
  - Send the access token to a Google API
  - Refresh the access token, if necessary
- Web server applications
- Installed applications
- Client-side (JavaScript) applications
- Applications on limited-input devices

Using OAuth 2.0 to Access Google APIs

Google APIs use OAuth 2.0 for authorization in web server applications, installed applications, and client-side applications alike. First, obtain OAuth 2.0 client credentials from the Google API Console. Then your application requests an access token from the Google Authorization Server; once the user grants consent, the token is issued. The application then sends the token to the Google API that you want to access.

For an interactive demonstration of using OAuth 2.0 with Google (including the option to use your own client credentials), experiment with the OAuth 2.0 Playground.

This page gives an overview of the OAuth 2.0 authorization scenarios that Google supports, and provides links to more detailed content. For details about using OAuth 2.0 for authentication, see OpenID Connect.

Note: Given the security implications of getting the implementation correct, we strongly encourage you to use OAuth 2.0 libraries when interacting with Google's OAuth 2.0 endpoints. It is a best practice to use well-debugged code provided by others, and it will help you protect yourself and your users. For more information, see Client libraries.

Basic steps

All applications that use OAuth 2.0 to access a Google API follow this basic pattern.

Obtain OAuth 2.0 credentials from the Google API Console

Visit the Google API Console to obtain OAuth 2.0 credentials such as a client ID and client secret that are known to both Google and your application. The set of values varies based on what type of application you are building. For example, a JavaScript application does not require a secret, but a web server application does.

Obtain an access token from the Google Authorization Server

Before your application can access private data using a Google API, it must obtain an access token that grants access to that API.
A single access token can grant varying degrees of access to multiple APIs. A variable parameter called scope controls the set of resources and operations that an access token permits. During the access-token request, your application sends one or more values in the scope parameter.

There are several ways to make this request, and they vary based on the type of application you are building. For example, a JavaScript application might request an access token using a browser redirect to Google, while an application installed on a device that has no browser uses web service requests.

Some requests require an authentication step where the user logs in with their Google account. After logging in, the user is asked whether they are willing to grant the permissions that your application is requesting. This process is called user consent. If the user grants the permission, the Google Authorization Server sends your application an access token (or an authorization code that your application can use to obtain an access token). If the user does not grant the permission, the server returns an error.

It is generally a best practice to request scopes incrementally, at the time access is required, rather than up front. For example, an app that wants to support purchases should not request Google Wallet access until the user presses the "buy" button; see Incremental authorization.

Send the access token to a Google API

After an application obtains an access token, it sends the token to a Google API in an HTTP Authorization header. It is possible to send tokens as URI query-string parameters, but we don't recommend it, because URI parameters can end up in log files that are not completely secure. Also, it is good REST practice to avoid creating unnecessary URI parameter names.

Access tokens are valid only for the set of operations and resources described in the scope of the token request.
For example, if an access token is issued for the Google+ API, it does not grant access to the Google Contacts API. You can, however, send that access token to the Google+ API multiple times for similar operations.

Refresh the access token, if necessary

Access tokens have limited lifetimes. If your application needs access to a Google API beyond the lifetime of a single access token, it can obtain a refresh token. A refresh token allows your application to obtain new access tokens.

Note: Save refresh tokens in secure long-term storage and continue to use them as long as they remain valid. Limits apply to the number of refresh tokens that are issued per client-user combination, and per user across all clients, and these limits are different. If your application requests enough refresh tokens to go over one of the limits, older refresh tokens stop working.

Scenarios

Web server applications

The Google OAuth 2.0 endpoint supports web server applications that use languages and frameworks such as PHP, Java, Python, Ruby, and ASP.NET.

The authorization sequence begins when your application redirects a browser to a Google URL; the URL includes query parameters that indicate the type of access being requested. Google handles the user authentication, session selection, and user consent. The result is an authorization code, which the application can exchange for an access token and a refresh token.

The application should store the refresh token for future use and use the access token to access a Google API. Once the access token expires, the application uses the refresh token to obtain a new one.

Your application sends a token request to the Google Authorization Server, receives an authorization code, exchanges the code for a token, and uses the token to call a Google API endpoint.

For details, see Using OAuth 2.0 for Web Server Applications.
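The redirect that begins this sequence is simply a Google URL carrying query parameters. Below is a minimal sketch, assuming Google's documented authorization endpoint; the client ID, redirect URI, and scope values are placeholders, not real credentials.

```python
# Sketch: building the authorization URL the application redirects the
# browser to. response_type=code requests an authorization code;
# access_type=offline additionally asks for a refresh token.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def authorization_url(client_id: str, redirect_uri: str, scope: str) -> str:
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": scope,
        "access_type": "offline",
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)

url = authorization_url("my-client-id", "http://localhost:8080/callback",
                        "https://www.googleapis.com/auth/drive.readonly")
```

After the user consents, Google redirects back to redirect_uri with a `code` query parameter, which the server exchanges for tokens at the token endpoint.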
Installed applications

The Google OAuth 2.0 endpoint also supports applications installed on devices such as computers, mobile phones, and tablets. When you create a client ID, specify that this is an installed application, then select Android, Chrome, iOS, or Other as the application type. The client ID and client secret are embedded in, and used from, the application itself.

The authorization sequence begins when the application redirects a browser to a Google URL; the URL includes query parameters that indicate the type of access being requested. Google handles the user authentication. The result is an authorization code issued to the application, which the application exchanges for an access token (and a refresh token).

The application uses the access token to call Google APIs and the refresh token to renew it: once the access token expires, the application exchanges the refresh token for a new access token.

Your application sends a token request to the Google Authorization Server, receives an authorization code, exchanges the code for a token, and uses the token to call a Google API endpoint.

For details, see Using OAuth 2.0 for Installed Applications.

Client-side (JavaScript) applications

The Google OAuth 2.0 endpoint also supports JavaScript applications that run in a browser.

The authorization sequence begins when your application redirects a browser to a Google URL; the URL includes query parameters that indicate the type of access being requested. Google handles the user authentication, session selection, and user consent. The result is an access token, which the client should validate before including it in a Google API request. When the token expires, the application repeats the process.

Your JS application sends a token request to the Google Authorization Server, receives a token, validates the token, and uses the token to call a Google API endpoint.

For details, see Using OAuth 2.0 for Client-side Applications.

Applications on limited-input devices

The Google OAuth 2.0 endpoint supports applications that run on limited-input devices such as game consoles, video cameras, and printers.

The authorization sequence begins with the application making a web service request to a Google URL for an authorization code. The response contains several parameters, including a URL and a code that the application shows to the user.
The user obtains the URL and code from the device, then switches to a separate device or computer with richer input capabilities. The user launches a browser, navigates to the specified URL, logs in, and enters the code. Meanwhile, the application polls a Google URL at a specified interval.

After the user approves access, the response from the Google server contains an access token and refresh token. The application should store the refresh token for future use and use the access token to access a Google API. Once the access token expires, the application uses the refresh token to obtain a new one.

The user logs in on a separate device that has a browser.

For details, see Using OAuth 2.0 for Devices.

Service accounts

Google APIs such as the Prediction API and Google Cloud Storage can act on behalf of your application without accessing user information. In these situations your application needs to prove its own identity to the API, but no user consent is necessary. Similarly, in enterprise scenarios, your application can request delegated access to some resources.

For these types of server-to-server interactions you need a service account, which is an account that belongs to your application instead of to an individual end-user. Your application calls Google APIs on behalf of the service account, and user consent is not required. (In non-service-account scenarios, your application calls Google APIs on behalf of end-users, and user consent is sometimes required.)

Note: These service-account scenarios require applications to create and cryptographically sign JSON Web Tokens (JWTs). We strongly encourage you to use a library to perform these tasks. If you write this code without using a library that abstracts token creation and signing, you might make errors that would have a severe impact on the security of your application. For a list of libraries that support this scenario, see the service-account documentation.
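To make the JWT in that note concrete: a service-account JWT is two base64url-encoded JSON segments (a header and a claim set) joined by a dot, followed by an RS256 signature over that string. The sketch below builds only the unsigned signing input; the service-account email, scope, and lifetime are placeholder assumptions, and a real implementation must sign the result with the account's private key, preferably via one of the recommended libraries.

```python
# Sketch: constructing the unsigned portion of a service-account JWT.
# A real token appends an RS256 signature over signing_input, computed with
# the service account's private key; use a vetted JWT library for that step.
import base64
import json
import time

def b64url(obj) -> str:
    """base64url-encode a JSON object, without padding."""
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def jwt_signing_input(sa_email: str, scope: str, lifetime: int = 3600) -> str:
    now = int(time.time())
    header = {"alg": "RS256", "typ": "JWT"}
    claims = {
        "iss": sa_email,                               # service account's email
        "scope": scope,                                # requested API scope(s)
        "aud": "https://oauth2.googleapis.com/token",  # token endpoint
        "iat": now,                                    # issued-at time
        "exp": now + lifetime,                         # expiry
    }
    return b64url(header) + "." + b64url(claims)

signing_input = jwt_signing_input(
    "svc@my-project.iam.gserviceaccount.com",
    "https://www.googleapis.com/auth/devstorage.read_only")
```

The signed JWT is then POSTed to the token endpoint to obtain an access token, as described below.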
A service account's credentials, which you obtain from the Google API Console, include a generated email address that is unique, a client ID, and at least one public/private key pair. You use the client ID and one private key to create a signed JWT and construct an access-token request in the appropriate format. Your application then sends the token request to the Google OAuth 2.0 Authorization Server, which returns an access token. The application uses the token to access a Google API. When the token expires, the application repeats the process.

Your server application uses a JWT to request a token from the Google Authorization Server, then uses the token to call a Google API endpoint. No end-user is involved.

For details, see the service-account documentation.

Note: Although you can use service accounts in applications that run from a G Suite domain, service accounts are not members of your G Suite account and aren't subject to domain policies set by G Suite administrators. For example, a policy set in the G Suite admin console to restrict the ability of G Suite end users to share documents outside of the domain would not apply to service accounts.

Token expiration

You must write your code to anticipate the possibility that a granted token might no longer work. A token might stop working for one of these reasons:
- The user has revoked access.
- The token has not been used for six months.
- The user changed passwords and the token contains Gmail scopes.
- The user account has exceeded a certain number of token requests.

There is currently a limit of 50 refresh tokens per user account per client. If the limit is reached, creating a new token automatically invalidates the oldest token without warning. This limit does not apply to service accounts.

There is also a larger limit on the total number of tokens a user account or service account can have across all clients. Most normal users won't exceed this limit, but a developer's test account might.
If you need to authorize multiple programs, machines, or devices, one workaround is to limit the number of clients that you authorize per user account to 15 or 20. If you are a G Suite admin, you can create additional admin users and use them to authorize some of the clients.

Client libraries

The following client libraries integrate with popular frameworks, which makes implementing OAuth 2.0 simpler. More features will be added to the libraries over time.
- Google API Client Library for Java
- Google API Client Library for Python
- Google API Client Library for Go
- Google API Client Library for .NET
- Google API Client Library for Ruby
- Google API Client Library for PHP
- Google API Client Library for JavaScript
- GTMAppAuth - OAuth Client Library for Mac and iOS
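Under the hood, the token refresh that these libraries perform for you is an ordinary form-encoded POST to Google's token endpoint. A minimal sketch of just the request body; the client and token values are placeholders, not real credentials.

```python
# Sketch: form body for exchanging a refresh token for a new access token.
# A real application sends this body in an HTTP POST with
# Content-Type: application/x-www-form-urlencoded to TOKEN_ENDPOINT.
from urllib.parse import urlencode

TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

def refresh_request_body(client_id: str, client_secret: str,
                         refresh_token: str) -> str:
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",  # marks this as a refresh-token grant
    })

body = refresh_request_body("my-client-id", "my-secret", "stored-refresh-token")
```

The JSON response contains a new access_token, which the application then sends on each API call in an "Authorization: Bearer <token>" header, as recommended above.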
https://w.atwiki.jp/doroboumama/pages/5950.html
127 :名無しの心子知らず:2010/09/11(土) 18:02:51 ID:i1+sPwHl
Sorry this is a story from more than ten years ago, but here it is.
At work, employee A was making a fuss that 50,000 yen had been taken from her belongings in her locker. The building had solid security, so it was definitely an inside job. A said, "I'm not going to press charges; I just want the money back," so one of the higher-ups announced at the morning meeting, "Please come forward." Naturally, nobody did.
By that point everyone already suspected B, who had worked there for over a decade, because she had always quietly taken credit for other people's work and lied without hesitation for the sake of her own numbers. But B had just gone through marriage, pregnancy, maternity leave, and a return to work, and was right in the middle of her happiest days, so I thought, "No way."
Then one day it was suddenly announced, "We're taking fingerprints!" and every employee was called in. B showed up claiming she had "spilled correction fluid," with all ten fingers painted white... It was as good as a confession.
Afterward she was called in by the higher-ups and resigned "voluntarily," with no severance pay. In just five years she threw away more than ten years' worth of severance.
From what I heard later, she had been well off after her parents died one after another and the insurance money came in, but her husband was basically a freeloader: the wedding reception, the honeymoon, the new home (a purchased condo), the car... all of it was paid for by B. He eventually quit his job to become a house-husband, which is also why B came back to work so soon.
It felt like she had suddenly found herself all alone and rushed to build a new family out of loneliness.
Just the other day I spotted B shopping with her child at a big supermarket and it all came back to me, so I wrote it down. By the way, B is a beautiful woman.

128 :名無しの心子知らず:2010/09/11(土) 18:08:38 ID:xpQNyV0d
Thanks, 127.
"In just five years she threw away more than ten years' worth of severance" — is that a mistake for "just 50,000 yen"?

129 :名無しの心子知らず:2010/09/11(土) 18:19:20 ID:i1+sPwHl
A MISTAAAAAKE!!!! I'm sorry!!!

130 :名無しの心子知らず:2010/09/11(土) 18:29:24 ID:fPVFliK1
For a second I thought this was going to be the "五万節" (Goman-bushi).

Next story → Kendo brother-in-law's wife (204)
https://w.atwiki.jp/pqjp/pages/53.html
<?xml version="1.0"?>
<TextLibrary>
  <Text tag="[QUEST_Q0E0_ACTION]">Destroy the Catapults</Text>
  <Text tag="[QUEST_Q0E0_FAILURE]">You have failed to destroy the Catapults.</Text>
  <Text tag="[QUEST_Q0E0_KILL]">You have destroyed the Catapults.</Text>
  <Text tag="[QUEST_Q0E0_RETURN]">Receive Reward</Text>
  <Text tag="[QUEST_Q0E0_REWARD]">You are rewarded for your service in protecting Drakenburg.</Text>
  <Text tag="[QUEST_Q0E0_STEP1]">You must first go to Skelheim and destroy the Catapults.</Text>
  <Text tag="[QUEST_Q0E0_STEP2]">You must now return to Drakenburg to receive your reward.</Text>
</TextLibrary>
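Files in this TextLibrary format can be read generically as a tag-to-text table. A minimal sketch with Python's standard xml.etree; the two entries inlined below are copied from the listing above.

```python
# Sketch: reading a TextLibrary XML file into a {tag: text} dictionary.
import xml.etree.ElementTree as ET

QUEST_XML = """<?xml version="1.0"?>
<TextLibrary>
  <Text tag="[QUEST_Q0E0_ACTION]">Destroy the Catapults</Text>
  <Text tag="[QUEST_Q0E0_RETURN]">Receive Reward</Text>
</TextLibrary>"""

def load_text_library(xml_text: str) -> dict:
    """Map each Text element's tag attribute to its (stripped) text body."""
    root = ET.fromstring(xml_text)
    return {t.get("tag"): (t.text or "").strip() for t in root.iter("Text")}

texts = load_text_library(QUEST_XML)
print(texts["[QUEST_Q0E0_ACTION]"])  # Destroy the Catapults
```

Game code can then look strings up by placeholder key (e.g. `[QUEST_Q0E0_RETURN]`) instead of hard-coding English text.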
https://w.atwiki.jp/hmiku/pages/21736.html
【Registered tags: VOCALOID / その他の文字 / ドンガリンゴP / 初音ミク / 巡音ルカ / 曲】

Lyrics: ドンガリンゴP
Music: ドンガリンゴP
Arrangement: ドンガリンゴP
Vocals: 巡音ルカ (Megurine Luka), 初音ミク (Hatsune Miku)

Song introduction

Dedicated to those who were left behind. (Not that this is about me, still using VOCALOID2 or anything.)

The tenth work by Kinra, a.k.a. ドンガリンゴP. The title is read "Limbo" (per the blog description). "One of this round's experiments... I tried fusing Miku's and Luka's voices." (per the blog description) The PV is built around a 256-color ASCII (ANSI) art video, with the pictures also drawn by the author himself. Guitar: 和田たけあき (くらげP). An experiment-laden sound that intertwines guitar and piano.

Lyrics (reproduced from Kinra's blog「深林間:倉庫」)

Love, riot, categorize. Shiver, slay 殻の中で 旋回。
生マレ 死ナセ 相継ぐ 崩れかけのパラダイスは今日も平和だ 生産ラインの果て 神の国へ(堕ちる)

Rage, lust, rationalize. Stray, rot ケイ素循環さ。
組マレ 砕ケ 絶えなく 作りかけのパラダイムを振り出しに戻す

殻の外は夢見る約束の地 〃〃〃〃光がある世界さ 〃〃〃〃〃〃〃〃浴びれば死ぬけど
どうして 僕らに与えてくれないの? こんなのが生なの?

灰は塩に 塵は道に 作り直し 浪費は無し 死も積もれば命となる 魂など自ずと憑く
僕らが追い込んだ神話 僕らの手で糧にしたら 聖なる物 無垢なる物 殻の中の何処にあるの?
嗚呼 この場所もか planned to be left out of heaven and hell

Rise, change, demystify. Take, break ゼロとイチを連打
引カレ 押サレ 続々 薄れかけのパラライシスは誰の為なの?

枷を解き自由になりたい 〃〃〃〃放たれる時を待ち倦む 〃〃〃〃〃〃〃〃〃身も砕けるの
のぞむのは こんなものか? 慈悲という名の罰か?

産まれ抱かれ呼ばれ泣かれ離れ廃れ汚れ乱れ焦がれ破れ遅れ溺れ懼れ暴れ潰れ壊れ
産み出され檢定され調整され作り直され上書きされ圧縮され暗号化され初期化され
禁じられ戒められ称えられ伝われ謳われ辱められ貶められ蔑まれ慈しまれ
禁制され複製され分類され隠蔽され最適化され再生され改名され削除され
電気と夢と熱と希望と涙の蒸気圧と欲望の減数分裂と真の臨界点と愛の放射壊変と悪の回折関数から物語の極限は黙示に谺する

嗚呼 隔てられた place out of heaven and hell

嗚呼 いつか光は 皆殺しに来るだろう

物語 + Story

Love, riot, Categorize. Shiver, slay, 殻の中で 旋回。
生マレ 死ナセ 相継ぐ 崩れかけの パラダイスは今日も平和だ 生産ラインの果て 神の国へ(堕ちる...)

Rage, lust, Rationalize. Stray, rot, ケイ素循環さ。
組マレ 砕ケ 絶えなく 作りかけの パラダイムを振り出しに戻す

殻の外は 夢見る約束の地 殻の外は 光がある 世界さ 殻の外は 光がある 浴びれば死ぬけど
僕らに与えてくれないの どうして こんなのが生なの?

灰は塩に 塵は道に 作り直し 浪費は無し 死も積もれば命となる 魂など自ずと憑く
僕らが追い込んだ神話 僕らの手で糧にしたら 聖なる物 無垢なる物 殻の中の何処にあるの?
嗚呼 この場所もか planned to be left out of heaven and hell

And still, we pray. ¶_

And only when every prayer failed did we begin to live. We have fed on myths that grew from the primal lie. We have made enough mistakes trying to reach heaven, but all we grasp is tears evaporated from the earth.
And yet none would desert this pile of deads. For it is where we stand. It is where we live. It is where we rehearse for the stage that never comes. And we rejoice. ¶_

Rise, change, Demystify. Take, break. ゼロとイチを連打
引カレ 押サレ 続々 薄れかけの パラライシスは誰の為なの?

枷を解き 自由になりたい 枷を解き 放たれる時を待ち倦む 枷を解き 放たれる時 身も砕けるの
慈悲という名の罰か? のぞむのは こんなものか?

産まれ抱かれ呼ばれ泣かれ離れ廃れ汚れ乱れ焦がれ破れ遅れ溺れ懼れ暴れ潰れ壊れ
産み出され檢定され調整され作り直され上書きされ圧縮され暗号化され初期化され
禁じられ戒められ称えられ伝われ謳われ辱められ貶められ蔑まれ慈しまれ
禁制され複制され分類され隱蔽され最適化され再生され改名され削除され

嗚呼 隔てられた place out of heaven and hell

電気と夢と熱と希望と涙の蒸気圧と欲望の減数分裂と真の臨界点愛の放射壞変と悪の回折関数から物語の極限は默示に谺する

嗚呼 いつか光は 皆殺しに来るだろう

And He said, "Let there be light." ¯

And then there were none. ¶

Comments
- Amazing -- Anonymous (2012-05-23 08:05:58)
- This... must have taken a lot of work. Nice editing, o7 (`・ω・´)ゞ -- Nanashi-tan (2012-06-06 22:30:52)
- So cool -- Anonymous (2013-02-03 19:05:23)
- Seriously cool... -- 不憫 (2013-03-03 20:43:23)