
What cool things can you do with yield?


What interesting, cool, or unexpected things can you do with generators and yield?
Any programming language counts, e.g. Python, JavaScript, etc.

Replies:

In JavaScript, the most common use of yield is probably combining it with Promises/Thunks to implement asynchronous flow, as in the famous tj/co · GitHub, so it is hardly "unexpected" any more.
Once you understand how Generators behave, writing a toy version of co is quite simple:
function async(generator) {
  return new Promise(function(resolve, reject) {
    var g = generator()

    // Resume the generator with the resolved value of the previous Promise
    function next(val) {
      var result = g.next(val)
      var value = result.value

      if (!result.done) {
        // Not finished yet: wait for the yielded Promise, then resume
        value.then(next).catch(reject)
      }
      else {
        // Generator returned: settle the outer Promise with its return value
        resolve(value)
      }
    }

    next()
  })
}
Isn't the most typical example async/await?


I don't know how yield is used to implement async/await elsewhere, but here is one example of the idea in C#:

IEnumerable<Task> SomeAsyncMethod()
{
  //blabla
  yield return await( asyncMethod, context );

  //blabla
  yield return await( asyncMethod, context );

  //blabla
}
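To make the mechanism concrete, here is a minimal Python sketch of the same pattern (not how any real library implements async/await; run, fetch, and the yielded thunks are made up for illustration). The generator yields one pending operation per "await" point, and a driver resumes it with each result:

def run(gen):
    # Drive a generator-based "async" function to completion, synchronously.
    result = None
    try:
        while True:
            thunk = gen.send(result)   # resume the generator with the last result
            result = thunk()           # a real event loop would wait for I/O here
    except StopIteration as stop:
        return stop.value

def fetch(url):
    # Stand-in for a real asynchronous operation
    return "response from " + url

def some_async_method():
    a = yield (lambda: fetch("http://example.com/a"))   # "await" point 1
    b = yield (lambda: fetch("http://example.com/b"))   # "await" point 2
    return a + " | " + b

print(run(some_async_method()))
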
You can make animations, for example:
# -*- coding: utf-8 -*-
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import math, random
# Requires NumPy and Matplotlib; an Anaconda install has both
fig, axes1 = plt.subplots()
# Set the axis limits
axes1.set_ylim(0, 1.4)
axes1.set_xlim(0, 1*np.pi/0.01)
# Initial x and y data arrays
xdata = np.arange(0, 2*np.pi, 0.01)
ydata = np.sin(xdata)
# Get the line object
line, = axes1.plot(xdata)
# Spike magnitude: starts at 0 and grows; the larger offset is, the larger the spikes
offset = 0.0

# FuncAnimation is given the generator data_gen as its frame source,
# so update receives the yielded data, not a frame number
def update(data):
    line.set_ydata(data)
    return line,

# Regenerate the whole curve on each frame; to redraw the full figure,
# just yield the entire y array
def data_gen():
    global offset
    while True:
        length = float(len(xdata))
        for i in range(len(xdata)):
            ydata[i] = math.sin(xdata[i]) + 0.2
            # Add random spikes to a section of the curve
            if i > length/18.0 and i < (length*2.7/6.0):
                ydata[i] += offset*(random.random()-0.5)
        offset += 0.05
        # Cap the spike magnitude and start over
        if offset >= 0.5:
            offset = 0.0
        yield ydata

# Setup done, start the animation
ani = animation.FuncAnimation(fig, update, data_gen, interval=800, repeat=True)
plt.show()
For simulating discrete events, is there any more concise and elegant way?

Overview — SimPy 3.0.8 documentation. This question was made for me.
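A minimal sketch using SimPy, where every simulated process is just a generator that yields events (the clock process and its tick values are made up for illustration):

import simpy

def clock(env, name, tick):
    # A process is a plain generator: yield an event, get resumed when it fires
    while True:
        print(name, env.now)
        yield env.timeout(tick)

env = simpy.Environment()
env.process(clock(env, 'fast', 0.5))
env.process(clock(env, 'slow', 1))
env.run(until=2)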

When someone claims to have implemented a sandbox inside CPython, you can use yield to tease them: "I was looking through the code and saw someone submitted this but didn't run it:..."

So cool I'm out of a job... A Curious Course on Coroutines and Concurrency shows how to write a concurrency library, and Generator Tricks for Systems Programmers shows how to write a stream-processing framework; see David Beazley's PyCon slides at dabeaz.com. After reading them I was simply stunned.
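A minimal sketch of such a generator pipeline in the spirit of those slides (the log file name and the ERROR pattern are made up for illustration); each stage is lazy, so the whole pipeline runs in constant memory:

import re

def read_lines(path):
    # Lazily produce lines from a (possibly huge) file
    with open(path) as f:
        for line in f:
            yield line

def grep(pattern, lines):
    # Filter a stream of lines without materializing it
    pat = re.compile(pattern)
    return (line for line in lines if pat.search(line))

# Compose the stages; nothing is read until the final iteration.
lines = read_lines("server.log")      # hypothetical log file
errors = grep(r"ERROR", lines)
print(sum(1 for _ in errors))         # count matching lines
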
Yield can also be used when training neural networks, for example this snippet from Lasagne/Lasagne · GitHub:
def train(iter_funcs, dataset, batch_size=BATCH_SIZE):
    """Train the model with `dataset` with mini-batch training. Each
       mini-batch has `batch_size` recordings.
    """
    num_batches_train = dataset['num_examples_train'] // batch_size
    num_batches_valid = dataset['num_examples_valid'] // batch_size

    for epoch in itertools.count(1):
        batch_train_losses = []
        for b in range(num_batches_train):
            batch_train_loss = iter_funcs['train'](b)
            batch_train_losses.append(batch_train_loss)

        avg_train_loss = np.mean(batch_train_losses)

        batch_valid_losses = []
        batch_valid_accuracies = []
        for b in range(num_batches_valid):
            batch_valid_loss, batch_valid_accuracy = iter_funcs['valid'](b)
            batch_valid_losses.append(batch_valid_loss)
            batch_valid_accuracies.append(batch_valid_accuracy)

        avg_valid_loss = np.mean(batch_valid_losses)
        avg_valid_accuracy = np.mean(batch_valid_accuracies)

        yield {
            'number': epoch,
            'train_loss': avg_train_loss,
            'valid_loss': avg_valid_loss,
            'valid_accuracy': avg_valid_accuracy,
        }
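Because train() yields a result dict per epoch, the caller consumes training as a lazy stream and decides when to stop. A rough usage sketch (iter_funcs and dataset come from the surrounding Lasagne example; num_epochs is a placeholder):

num_epochs = 10  # assumed stopping point for this sketch
for epoch in train(iter_funcs, dataset):
    print("Epoch %d: train loss %.4f, valid loss %.4f, valid accuracy %.2f%%" % (
        epoch['number'], epoch['train_loss'],
        epoch['valid_loss'], 100 * epoch['valid_accuracy']))
    if epoch['number'] >= num_epochs:
        break
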
Tornado's coroutine model is implemented with generators, combined with an event loop to achieve high concurrency.

You can also use an iterator to traverse a binary tree.
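A minimal sketch of that idea (the Node class and the tiny example tree are made up for illustration); the in-order traversal becomes a generator, so callers get a lazy iterator over the values:

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def inorder(node):
    # In-order traversal as a generator: left subtree, this node, right subtree
    if node is not None:
        yield from inorder(node.left)
        yield node.value
        yield from inorder(node.right)

tree = Node(2, Node(1), Node(3))
print(list(inorder(tree)))  # [1, 2, 3]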
